+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release
+ [[ k8s-1.11.0-release =~ openshift-.* ]]
+ [[ k8s-1.11.0-release =~ .*-1.9.3-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.3
+ KUBEVIRT_PROVIDER=k8s-1.10.3
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/07/30 08:55:48 Waiting for host: 192.168.66.101:22
2018/07/30 08:55:51 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/07/30 08:55:59 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/07/30 08:56:04 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: connection refused. Sleeping 5s
2018/07/30 08:56:09 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.3
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
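The `Waiting for host` / `Problem with dial` retries above come from a simple dial-and-sleep loop in the provider tooling. A minimal bash sketch of that pattern (the function name and the sleep-interval parameter are illustrative, not taken from the harness):

```shell
#!/usr/bin/env bash

# wait_for_host HOST PORT RETRIES SLEEP_SECONDS
# Dial-and-sleep loop: returns 0 once a TCP connection to HOST:PORT
# succeeds, non-zero after RETRIES failed attempts.
wait_for_host() {
    local host=$1 port=$2 retries=$3 sleep_s=$4
    local i
    for ((i = 0; i < retries; i++)); do
        # /dev/tcp/HOST/PORT is a bash pseudo-device; the redirect fails
        # fast on "connection refused" or "no route to host".
        if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
            echo "Connected to tcp://${host}:${port}"
            return 0
        fi
        echo "Problem with dial: ${host}:${port}. Sleeping ${sleep_s}s"
        sleep "${sleep_s}"
    done
    return 1
}
```

Once the SSH port on a node VM accepts connections, the provisioning continues with `kubeadm`.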
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 27.005829 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:55a0772a7528d03319bc44c33dc4677fe21ed8f70824e547ee17c7c66d2908f8

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/07/30 08:56:53 Waiting for host: 192.168.66.102:22
2018/07/30 08:56:56 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/30 08:57:04 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/30 08:57:09 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: connection refused. Sleeping 5s
2018/07/30 08:57:14 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
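The `--discovery-token-ca-cert-hash sha256:…` value in the join command above is not secret material: it is the SHA-256 digest of the cluster CA's DER-encoded public key, printed by `kubeadm init` so joining nodes can pin the CA (this CI run bypasses the check with `--discovery-token-unsafe-skip-ca-verification=true`). The derivation, sketched here against a throwaway self-signed certificate standing in for /etc/kubernetes/pki/ca.crt:

```shell
#!/usr/bin/env bash
set -e
tmp=$(mktemp -d)

# Stand-in for /etc/kubernetes/pki/ca.crt: a throwaway self-signed cert.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=kubernetes" \
    -keyout "${tmp}/ca.key" -out "${tmp}/ca.crt" 2>/dev/null

# The documented kubeadm recipe: hash the CA's DER-encoded public key.
hash=$(openssl x509 -pubkey -in "${tmp}/ca.crt" \
    | openssl rsa -pubin -outform der 2>/dev/null \
    | openssl dgst -sha256 -hex | sed 's/^.* //')
echo "sha256:${hash}"
rm -rf "${tmp}"
```

On a real control plane the same pipeline over the actual ca.crt reproduces the `sha256:55a0…` value shown in the log.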
[discovery] Trying to connect to API Server "192.168.66.101:6443"
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 39588992 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01   Ready     master    54s   v1.10.3
node02   Ready     <none>    13s   v1.10.3
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    56s       v1.10.3
node02    Ready     <none>    15s       v1.10.3
+ make cluster-sync
./cluster/build.sh
Building ...
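The `set +e` block above polls `kubectl get nodes --no-headers`, greps for `NotReady`, and only declares "Nodes are ready:" when nothing matches. The same check, sketched against canned output so it runs without a cluster (the `sample` variable stands in for a real `kubectl` call):

```shell
#!/usr/bin/env bash

# Stand-in for `kubectl get nodes --no-headers` output; a real run
# queries the cluster instead.
sample='node01   Ready    master   54s   v1.10.3
node02   Ready    <none>   13s   v1.10.3'

# Same check as the trace above: grep for NotReady and only proceed
# once no node matches. `|| true` keeps grep's exit status 1 (no
# matches) from killing a `set -e` shell.
not_ready=$(printf '%s\n' "$sample" | grep NotReady || true)

if [ -z "$not_ready" ]; then
    echo 'Nodes are ready:'
else
    echo "Still waiting on: $not_ready"
fi
```

With both nodes `Ready`, the sketch prints `Nodes are ready:`, matching the log.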
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200 go version go1.10 linux/amd64 go version go1.10 linux/amd64 make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt' hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200 go version go1.10 linux/amd64 go version go1.10 linux/amd64 find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory Compiling tests... compiled tests.test hack/build-docker.sh build Sending build context to Docker daemon 40.39 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller ---> Using cache ---> b00c84523b53 Step 4/8 : WORKDIR /home/virt-controller ---> Using cache ---> b76b8bd8cd39 Step 5/8 : USER 1001 ---> Using cache ---> b6d9ad9ed232 Step 6/8 : COPY virt-controller /usr/bin/virt-controller ---> Using cache ---> 2e13773a0cd7 Step 7/8 : ENTRYPOINT /usr/bin/virt-controller ---> Using cache ---> 9895842547f9 Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-controller" '' ---> Running in a7e99efbdde8 ---> 34ce88b06ce6 Removing intermediate container a7e99efbdde8 Successfully built 34ce88b06ce6 Sending build context to Docker daemon 43.31 MB Step 1/9 : FROM kubevirt/libvirt:4.2.0 ---> 5f0bfe81a3e0 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 945996802736 Step 3/9 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107 ---> Using cache ---> 672f9ab56316 Step 4/9 : COPY virt-launcher /usr/bin/virt-launcher ---> Using cache ---> aea1b3ffeae0 Step 5/9 : RUN setcap CAP_NET_BIND_SERVICE=+eip 
/usr/bin/qemu-system-x86_64 ---> Using cache ---> c749bf4ef1d0 Step 6/9 : RUN mkdir -p /usr/share/kubevirt/virt-launcher ---> Using cache ---> 7dc5a428196a Step 7/9 : COPY sock-connector /usr/share/kubevirt/virt-launcher/ ---> Using cache ---> 8a182b35b2e8 Step 8/9 : ENTRYPOINT /usr/bin/virt-launcher ---> Using cache ---> 9d20c7fbf846 Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-launcher" '' ---> Running in 122e390270af ---> ba5acda53340 Removing intermediate container 122e390270af Successfully built ba5acda53340 Sending build context to Docker daemon 41.68 MB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/5 : COPY virt-handler /usr/bin/virt-handler ---> Using cache ---> 8704f5221ac6 Step 4/5 : ENTRYPOINT /usr/bin/virt-handler ---> Using cache ---> 71dcbce8b450 Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-handler" '' ---> Running in 4de583eb30f1 ---> c73641f806ba Removing intermediate container 4de583eb30f1 Successfully built c73641f806ba Sending build context to Docker daemon 38.81 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api ---> Using cache ---> ed1ebf600ee1 Step 4/8 : WORKDIR /home/virt-api ---> Using cache ---> 0769dad023e5 Step 5/8 : USER 1001 ---> Using cache ---> 0cb65afb0c2b Step 6/8 : COPY virt-api /usr/bin/virt-api ---> 4336681d942e Removing intermediate container f8e96dab4ff3 Step 7/8 : ENTRYPOINT /usr/bin/virt-api ---> Running in 12885d8f8999 ---> bb7157e433f4 Removing intermediate container 12885d8f8999 Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-api" '' ---> Running in a5bf47af1d23 ---> 23ca4c44cda3 Removing intermediate container a5bf47af1d23 Successfully built 23ca4c44cda3 Sending build context to Docker daemon 4.096 kB Step 1/7 : 
FROM fedora:28 ---> cc510acfcd70 Step 2/7 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/7 : ENV container docker ---> Using cache ---> 62847a2a1fa8 Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img ---> Using cache ---> 02134835a6aa Step 5/7 : ADD entrypoint.sh / ---> Using cache ---> ec0843818da7 Step 6/7 : CMD /entrypoint.sh ---> Using cache ---> 754029bb4bd2 Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.11.0-release2" '' ---> Using cache ---> 1220ce6ff0fa Successfully built 1220ce6ff0fa Sending build context to Docker daemon 2.56 kB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/5 : ENV container docker ---> Using cache ---> 62847a2a1fa8 Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all ---> Using cache ---> 207487abe7b2 Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "vm-killer" '' ---> Using cache ---> e2940dd6b38f Successfully built e2940dd6b38f Sending build context to Docker daemon 5.12 kB Step 1/7 : FROM debian:sid ---> 68f33cf86aab Step 2/7 : MAINTAINER "David Vossel" \ ---> Using cache ---> 5734d749eb5c Step 3/7 : ENV container docker ---> Using cache ---> f8775a77966f Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/* ---> Using cache ---> 1a40cf222a61 Step 5/7 : ADD entry-point.sh / ---> Using cache ---> 77b545d92fe7 Step 6/7 : CMD /entry-point.sh ---> Using cache ---> dfe20d463305 Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "registry-disk-v1alpha" '' ---> Using cache ---> b90f9ac6e4b9 Successfully built b90f9ac6e4b9 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM 
localhost:33139/kubevirt/registry-disk-v1alpha:devel ---> b90f9ac6e4b9 Step 2/4 : MAINTAINER "David Vossel" \ ---> Using cache ---> bf4321f1bdcf Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img ---> Using cache ---> fdb5aa18f4f6 Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" '' ---> Using cache ---> a3970deead12 Successfully built a3970deead12 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33139/kubevirt/registry-disk-v1alpha:devel ---> b90f9ac6e4b9 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3fbeaa31b861 Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2 ---> Using cache ---> 2f8d65aae622 Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" '' ---> Using cache ---> 8e0988b9a102 Successfully built 8e0988b9a102 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33139/kubevirt/registry-disk-v1alpha:devel ---> b90f9ac6e4b9 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3fbeaa31b861 Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso ---> Using cache ---> 61427d5da613 Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" '' ---> Using cache ---> 64c1d6221966 Successfully built 64c1d6221966 Sending build context to Docker daemon 35.59 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl ---> Using cache ---> 985fe391c056 Step 4/8 : WORKDIR /home/virtctl ---> Using cache ---> 3b2cae8ac543 Step 5/8 : USER 1001 ---> Using cache ---> 0c06e5b4a900 Step 6/8 : 
COPY subresource-access-test /subresource-access-test ---> 28bf78e4dda0 Removing intermediate container d39073d5cd6e Step 7/8 : ENTRYPOINT /subresource-access-test ---> Running in f190fa6cf777 ---> 42d40fb72877 Removing intermediate container f190fa6cf777 Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "subresource-access-test" '' ---> Running in 1d1753c27dcd ---> 8acea7e837e4 Removing intermediate container 1d1753c27dcd Successfully built 8acea7e837e4 Sending build context to Docker daemon 3.072 kB Step 1/9 : FROM fedora:28 ---> cc510acfcd70 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> bfe77d5699ed Step 3/9 : ENV container docker ---> Using cache ---> 62847a2a1fa8 Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all ---> Using cache ---> d3456b1644b1 Step 5/9 : ENV GIMME_GO_VERSION 1.9.2 ---> Using cache ---> 0ba81fddbba1 Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh ---> Using cache ---> 5d33abe3f819 Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin" ---> Using cache ---> 783826523be1 Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli ---> Using cache ---> 711bc8d15952 Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "winrmcli" '' ---> Using cache ---> b4a60e1f700b Successfully built b4a60e1f700b Sending build context to Docker daemon 36.8 MB Step 1/5 : FROM fedora:27 ---> 9110ae7f579f Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> e3238544ad97 Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar ---> 26e3b94713bb Removing intermediate container 271dec845ca6 Step 4/5 : ENTRYPOINT /example-hook-sidecar ---> Running in 46aa4714dc8f ---> a21adf637a08 Removing intermediate container 46aa4714dc8f Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.11.0-release2" '' ---> Running in d9b140756040 ---> 
a295e256321e Removing intermediate container d9b140756040 Successfully built a295e256321e hack/build-docker.sh push The push refers to a repository [localhost:33139/kubevirt/virt-controller] 1dd2243f20c7: Preparing aa89340cf7a8: Preparing 891e1e4ef82a: Preparing aa89340cf7a8: Pushed 1dd2243f20c7: Pushed 891e1e4ef82a: Pushed devel: digest: sha256:61164cfe9e041061f54213d318d8cba9580c930764c94f0656d7064f79cfafa6 size: 949 The push refers to a repository [localhost:33139/kubevirt/virt-launcher] 83b34c9602df: Preparing 32cd7de8f811: Preparing 746d0ab94251: Preparing ff60b29a54a3: Preparing 633427c64a24: Preparing 83b34c9602df: Waiting da38cf808aa5: Preparing 746d0ab94251: Waiting b83399358a92: Preparing 186d8b3e4fd8: Preparing 633427c64a24: Waiting fa6154170bf5: Preparing 5eefb9960a36: Preparing 32cd7de8f811: Waiting 891e1e4ef82a: Preparing fa6154170bf5: Waiting 891e1e4ef82a: Waiting 5eefb9960a36: Waiting 83b34c9602df: Pushed 32cd7de8f811: Pushed 746d0ab94251: Pushed da38cf808aa5: Pushed b83399358a92: Pushed 186d8b3e4fd8: Pushed fa6154170bf5: Pushed 633427c64a24: Pushed 891e1e4ef82a: Mounted from kubevirt/virt-controller ff60b29a54a3: Pushed 5eefb9960a36: Pushed devel: digest: sha256:c876f667c1a58243df58b1407ed59e069c26717475622e5ab06adc05637cf13f size: 2620 The push refers to a repository [localhost:33139/kubevirt/virt-handler] 6f9e1ce7ff46: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-launcher 6f9e1ce7ff46: Pushed devel: digest: sha256:33b2db2f5ed24de05a252bb3985106986771631ab954c069c283fde05346beb1 size: 741 The push refers to a repository [localhost:33139/kubevirt/virt-api] 3d6994e10479: Preparing 82fc744c99b4: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-handler 82fc744c99b4: Pushed 3d6994e10479: Pushed devel: digest: sha256:1f36d8b5caec54245f00273318688b179975f60c030da2080d3bafea482c3b24 size: 948 The push refers to a repository [localhost:33139/kubevirt/disks-images-provider] 71ad31feb2c5: Preparing 
21d4b721776e: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-api 71ad31feb2c5: Pushed 21d4b721776e: Pushed devel: digest: sha256:8248c33d4f2cd30ad33251df9173b3ecad245afebd777a5171ab2e204d28df4a size: 948 The push refers to a repository [localhost:33139/kubevirt/vm-killer] c4cfadeeaf5f: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider c4cfadeeaf5f: Pushed devel: digest: sha256:47714e82b2e2d1b6dc3e1e584d4a04373fb18b38d97dac6b3a7d35ec336a7166 size: 740 The push refers to a repository [localhost:33139/kubevirt/registry-disk-v1alpha] 661cce8d8e52: Preparing 41e0baba3077: Preparing 25edbec0eaea: Preparing 661cce8d8e52: Pushed 41e0baba3077: Pushed 25edbec0eaea: Pushed devel: digest: sha256:b7e540ff190967aaaa59b6d29709634fc580702f074373817a5746502655f2d2 size: 948 The push refers to a repository [localhost:33139/kubevirt/cirros-registry-disk-demo] 861539d118fb: Preparing 661cce8d8e52: Preparing 41e0baba3077: Preparing 25edbec0eaea: Preparing 661cce8d8e52: Waiting 41e0baba3077: Waiting 25edbec0eaea: Waiting 25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha 41e0baba3077: Mounted from kubevirt/registry-disk-v1alpha 661cce8d8e52: Mounted from kubevirt/registry-disk-v1alpha 861539d118fb: Pushed devel: digest: sha256:2bc0ec7ac7d5b07023e89869d99234aa30109772137ec2bd538ee08ef1b22c4e size: 1160 The push refers to a repository [localhost:33139/kubevirt/fedora-cloud-registry-disk-demo] 3c128f86e56a: Preparing 661cce8d8e52: Preparing 41e0baba3077: Preparing 25edbec0eaea: Preparing 661cce8d8e52: Mounted from kubevirt/cirros-registry-disk-demo 41e0baba3077: Mounted from kubevirt/cirros-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo 3c128f86e56a: Pushed devel: digest: sha256:8f6b51e1dbe7c16a62004d8889773ad4fb893166257d7435ee5e70676642297e size: 1161 The push refers to a repository [localhost:33139/kubevirt/alpine-registry-disk-demo] 9a9e79d66e6a: Preparing 661cce8d8e52: 
Preparing 41e0baba3077: Preparing 25edbec0eaea: Preparing 41e0baba3077: Mounted from kubevirt/fedora-cloud-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo 661cce8d8e52: Mounted from kubevirt/fedora-cloud-registry-disk-demo 9a9e79d66e6a: Pushed devel: digest: sha256:380b93b3e6cf2189585f4f3ff9823125aa6af7d4218da5544444489de4c87fd9 size: 1160 The push refers to a repository [localhost:33139/kubevirt/subresource-access-test] c11f7736ebab: Preparing 25cb73590a9d: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider 25cb73590a9d: Pushed c11f7736ebab: Pushed devel: digest: sha256:9994fb4cad6a8879f7eabe14858faebd7164e0d0a1812b6146f2f06ff9382462 size: 948 The push refers to a repository [localhost:33139/kubevirt/winrmcli] f8083e002d0b: Preparing 53c709abc882: Preparing 9ca98a0f492b: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/subresource-access-test f8083e002d0b: Pushed 9ca98a0f492b: Pushed 53c709abc882: Pushed devel: digest: sha256:2bb0f2a7c6a6c084c1e57bd409bf447d7542882fdcc434f452f3d919561dd272 size: 1165 The push refers to a repository [localhost:33139/kubevirt/example-hook-sidecar] 8f43a7c79eea: Preparing 39bae602f753: Preparing 8f43a7c79eea: Pushed 39bae602f753: Pushed devel: digest: sha256:32a2ccb6e34d8d927b0605ce8f557474440c55ada72b0c1d9bd9a346b4dfb0a9 size: 740 make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt' Done ./cluster/clean.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/vendor ++ 
CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release2 ++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release2 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-151-gdc3f4c7 ++ KUBEVIRT_VERSION=v0.7.0-151-gdc3f4c7 + source cluster/k8s-1.10.3/provider.sh ++ set -e ++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357 ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh +++ 
binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.10.3.sh ++ source hack/config-provider-k8s-1.10.3.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl +++ docker_prefix=localhost:33139/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Cleaning up ...' Cleaning up ... 
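The trace above shows hack/config.sh sourcing hack/config-default.sh first and hack/config-provider-k8s-1.10.3.sh afterwards, which is why the provider file's `master_ip=127.0.0.1` and `docker_tag=devel` override the defaults (`192.168.200.2`, `latest`). A minimal sketch of that last-source-wins layering, using temp files in place of the real config scripts:

```shell
#!/usr/bin/env bash
set -e
tmp=$(mktemp -d)

# Stand-ins for hack/config-default.sh and hack/config-provider-k8s-1.10.3.sh.
cat > "${tmp}/config-default.sh" <<'EOF'
master_ip=192.168.200.2
docker_tag=latest
EOF
cat > "${tmp}/config-provider.sh" <<'EOF'
master_ip=127.0.0.1
docker_tag=devel
EOF

# Later sources override earlier ones: the last assignment wins.
source "${tmp}/config-default.sh"
source "${tmp}/config-provider.sh"

echo "master_ip=${master_ip} docker_tag=${docker_tag}"
rm -rf "${tmp}"
```

An optional hack/config-local.sh, tested for at the end of the trace, would layer on top the same way.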
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + 
KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n default delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete secrets -l kubevirt.io
No resources found
+ _kubectl -n default delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pv -l kubevirt.io
No resources found
+ _kubectl -n default delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pvc -l kubevirt.io
No resources found
+ _kubectl -n default delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete ds -l kubevirt.io
No resources found
+ _kubectl -n default delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n default delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pods -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete roles -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n default delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ wc -l
++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ cluster/k8s-1.10.3/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ for i in '${namespaces[@]}'
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete deployment -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete rs -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete services -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete secrets -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pv -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pvc -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete ds -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pods -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete roles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ wc -l
++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ cluster/k8s-1.10.3/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ sleep 2
+ echo Done
Done
./cluster/deploy.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-151-gdc3f4c7
++ KUBEVIRT_VERSION=v0.7.0-151-gdc3f4c7
+ source cluster/k8s-1.10.3/provider.sh
++ set -e
++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.3.sh
++ source hack/config-provider-k8s-1.10.3.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl
+++ docker_prefix=localhost:33139/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Deploying ...'
Deploying ...
+ [[ -z k8s-1.11.0-release ]]
+ [[ k8s-1.11.0-release =~ .*-dev ]]
+ [[ k8s-1.11.0-release =~ .*-release ]]
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]]
+ continue
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]]
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created
serviceaccount "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created
rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
role.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created
serviceaccount "kubevirt-controller" created
serviceaccount "kubevirt-privileged" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created
service "virt-api" created
deployment.extensions "virt-api" created
deployment.extensions "virt-controller" created
daemonset.extensions "virt-handler" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
persistentvolumeclaim "disk-alpine" created
persistentvolume "host-path-disk-alpine" created
persistentvolumeclaim "disk-custom" created
persistentvolume "host-path-disk-custom" created
daemonset.extensions "disks-images-provider" created
serviceaccount "kubevirt-testing" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created
+ [[ k8s-1.10.3 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n 'virt-api-7d79764579-frt4f 0/1 ContainerCreating 0 5s
virt-api-7d79764579-hmhf7 0/1 ContainerCreating 0 4s
virt-controller-7d57d96b65-d9qsl 0/1 ContainerCreating 0 4s
virt-controller-7d57d96b65-rs8jn 0/1 ContainerCreating 0 4s
virt-handler-k84xh 0/1 ContainerCreating 0 5s
virt-handler-zktsz 0/1 ContainerCreating 0 5s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ cluster/kubectl.sh get pods -n kube-system --no-headers
+ grep -v Running
disks-images-provider-9tn5b 0/1 ContainerCreating 0 2s
disks-images-provider-tqbmm 0/1 ContainerCreating 0 1s
virt-api-7d79764579-frt4f 0/1 ContainerCreating 0 6s
virt-api-7d79764579-hmhf7 0/1 ContainerCreating 0 5s
virt-controller-7d57d96b65-d9qsl 0/1 ContainerCreating 0 5s
virt-controller-7d57d96b65-rs8jn 0/1 ContainerCreating 0 5s
virt-handler-k84xh 0/1 ContainerCreating 0 6s
virt-handler-zktsz 0/1 ContainerCreating 0 6s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME READY STATUS RESTARTS AGE
disks-images-provider-9tn5b 1/1 Running 0 43s
disks-images-provider-tqbmm 1/1 Running 0 42s
etcd-node01 1/1 Running 0 13m
kube-apiserver-node01 1/1 Running 0 13m
kube-controller-manager-node01 1/1 Running 0 13m
kube-dns-86f4d74b45-k8nmp 3/3 Running 0 14m
kube-flannel-ds-kwthh 1/1 Running 0 14m
kube-flannel-ds-zxrfs 1/1 Running 0 13m
kube-proxy-fmg48 1/1 Running 0 13m
kube-proxy-p4krg 1/1 Running 0 14m
kube-scheduler-node01 1/1 Running 0 13m
virt-api-7d79764579-frt4f 1/1 Running 0 47s
virt-api-7d79764579-hmhf7 1/1 Running 0 46s
virt-controller-7d57d96b65-d9qsl 1/1 Running 0 46s
virt-controller-7d57d96b65-rs8jn 1/1 Running 0 46s
virt-handler-k84xh 1/1 Running 0 47s
virt-handler-zktsz 1/1 Running 0 47s
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
No resources found.
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
No resources found.
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:17:39Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:05:37Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/junit.xml'
+ [[ k8s-1.11.0-release =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/junit.xml'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
compiled tests.test
hack/functests.sh
Running Suite: Tests Suite
==========================
Random Seed: 1532942085
Will run 149 of 149 specs

•
------------------------------
• [SLOW TEST:48.899 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userDataBase64 source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
should have cloud-init data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:82
------------------------------
2018/07/30 05:15:35 read closing down: EOF
• [SLOW TEST:112.655 seconds]
2018/07/30 05:17:28 read closing down: EOF
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userDataBase64 source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
with injected ssh-key
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:92
should have ssh-key under authorized keys
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:93
------------------------------
2018/07/30 05:18:18 read closing down: EOF
• [SLOW TEST:60.317 seconds]
2018/07/30 05:18:28 read closing down: EOF
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userData source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:118
should process provided cloud-init data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:119
------------------------------
2018/07/30 05:19:13 read closing down: EOF
• [SLOW TEST:44.598 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
should take user-data from k8s secret
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:162
------------------------------
•
Pod name: disks-images-provider-9tn5b Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f Pod phase: Running
level=error timestamp=2018-07-30T09:19:12.668015Z pos=subresource.go:91 component=virt-api reason="tls: use of closed connection" msg="error ecountered reading from remote podExec stream"
2018/07/30 09:19:14 http: TLS handshake error from 10.244.1.1:40724: EOF
2018/07/30 09:19:24 http: TLS handshake error from 10.244.1.1:40730: EOF
2018/07/30 09:19:34 http: TLS handshake error from 10.244.1.1:40736: EOF
2018/07/30 09:19:44 http: TLS handshake error from 10.244.1.1:40742: EOF
2018/07/30 09:19:54 http: TLS handshake error from 10.244.1.1:40748: EOF
2018/07/30 09:20:04 http: TLS handshake error from 10.244.1.1:40754: EOF
2018/07/30 09:20:14 http: TLS handshake error from 10.244.1.1:40760: EOF
2018/07/30 09:20:24 http: TLS handshake error from 10.244.1.1:40766: EOF
2018/07/30 09:20:34 http: TLS handshake error from 10.244.1.1:40772: EOF
2018/07/30 09:20:44 http: TLS handshake error from 10.244.1.1:40778: EOF
2018/07/30 09:20:54 http: TLS handshake error from 10.244.1.1:40784: EOF
2018/07/30 09:21:04 http: TLS handshake error from 10.244.1.1:40790: EOF
2018/07/30 09:21:14 http: TLS handshake error from 10.244.1.1:40796: EOF
2018/07/30 09:21:24 http: TLS handshake error from 10.244.1.1:40802: EOF
Pod name: virt-api-7d79764579-hmhf7 Pod phase: Running
2018/07/30 09:20:13 http: TLS handshake error from 10.244.0.1:47782: EOF
level=info timestamp=2018-07-30T09:20:15.850197Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:20:23 http: TLS handshake error from 10.244.0.1:47806: EOF
2018/07/30 09:20:33 http: TLS handshake error from 10.244.0.1:47830: EOF
level=info timestamp=2018-07-30T09:20:41.023237Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:20:42.101966Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:20:43 http: TLS handshake error from 10.244.0.1:47854: EOF
level=info timestamp=2018-07-30T09:20:45.799442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:20:53 http: TLS handshake error from 10.244.0.1:47878: EOF
2018/07/30 09:21:03 http: TLS handshake error from 10.244.0.1:47902: EOF
level=info timestamp=2018-07-30T09:21:11.137648Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:21:12.214662Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:21:13 http: TLS handshake error from 10.244.0.1:47926: EOF
level=info timestamp=2018-07-30T09:21:15.835621Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:21:23 http: TLS handshake error from 10.244.0.1:47950: EOF
Pod name: virt-controller-7d57d96b65-d9qsl Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn Pod phase: Running
level=info timestamp=2018-07-30T09:15:35.473215Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhvqg kind= uid=1df4fe91-93d9-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T09:15:35.474386Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhvqg kind= uid=1df4fe91-93d9-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T09:17:28.138050Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7hjg4 kind= uid=6119401d-93d9-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T09:17:28.138586Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7hjg4 kind= uid=6119401d-93d9-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T09:18:28.703425Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T09:18:28.713148Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T09:18:28.802303Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8dlwx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8dlwx"
level=info timestamp=2018-07-30T09:19:13.042444Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwfdvt kind= uid=9fa2ce69-93d9-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T09:19:13.043234Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwfdvt kind= uid=9fa2ce69-93d9-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T09:19:13.224938Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwfdvt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwfdvt"
level=info timestamp=2018-07-30T09:19:13.240324Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwfdvt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwfdvt"
level=info timestamp=2018-07-30T09:19:13.280187Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwfdvt\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiwfdvt, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 9fa2ce69-93d9-11e8-97bf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwfdvt"
level=info timestamp=2018-07-30T09:19:13.296805Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwfdvt\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiwfdvt, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 9fa2ce69-93d9-11e8-97bf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwfdvt"
level=info timestamp=2018-07-30T09:19:13.400615Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T09:19:13.401101Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-k84xh Pod phase: Running
level=info timestamp=2018-07-30T09:19:30.098057Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:19:31.209872Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T09:19:31.212077Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind=Domain uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T09:19:31.821608Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:19:31.832565Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="No update processing required" level=info timestamp=2018-07-30T09:19:31.856359Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-30T09:19:31.858708Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind=Domain uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-30T09:19:31.879660Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-30T09:19:31.924106Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-30T09:19:31.924268Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-30T09:19:31.931879Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-30T09:19:31.965331Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-30T09:19:31.965429Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Processing shutdown." 
level=info timestamp=2018-07-30T09:19:31.966301Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmitkk8j" level=info timestamp=2018-07-30T09:19:32.024442Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmitkk8j kind= uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-zktsz Pod phase: Running level=info timestamp=2018-07-30T09:18:46.918854Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-30T09:18:46.919117Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind=Domain uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-30T09:18:46.954082Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-30T09:18:46.957242Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-30T09:18:46.957350Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="No update processing required" level=info timestamp=2018-07-30T09:18:46.988977Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-30T09:18:46.989115Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-30T09:18:47.006298Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-30T09:19:12.701275Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-30T09:19:12.702706Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-30T09:19:12.714216Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi8dlwx" level=info timestamp=2018-07-30T09:19:12.881879Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind= uid=8516a5e1-93d9-11e8-97bf-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-30T09:19:12.882492Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-07-30T09:19:12.882802Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind=VirtualMachineInstance uid= msg="Processing shutdown." 
level=info timestamp=2018-07-30T09:19:12.884309Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8dlwx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
------------------------------
• Failure [139.366 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting and stopping the same VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:90
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:91
      should success multiple times [It]
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:92

      Timed out after 120.000s.
      Expected
          : false
      to be true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1065
------------------------------
STEP: Starting the VirtualMachineInstance
level=info timestamp=2018-07-30T09:19:14.013636Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmitkk8j kind=VirtualMachineInstance uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmitkk8j-b9wzm"
level=info timestamp=2018-07-30T09:19:30.540383Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmitkk8j kind=VirtualMachineInstance uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmitkk8j-b9wzm"
level=info timestamp=2018-07-30T09:19:32.372733Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmitkk8j kind=VirtualMachineInstance uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T09:19:32.394363Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmitkk8j kind=VirtualMachineInstance uid=9fd9208e-93d9-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Stopping the VirtualMachineInstance
STEP: Waiting until the VirtualMachineInstance is gone
• [SLOW TEST:16.854 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:111
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:112
      should not modify the spec on status update
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:113
------------------------------
• [SLOW TEST:27.741 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting multiple VMIs
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:129
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:130
      should success
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:131
------------------------------
volumedisk0
compute
• [SLOW TEST:40.810 seconds]
Configurations
2018/07/30 05:22:58 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with 3 CPU cores
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:56
      should report 3 cpu cores under guest OS
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:62
------------------------------
•
------------------------------
• [SLOW TEST:18.798 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-2Mi
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
S [SKIPPING] [0.330 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-1Gi [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        No node with hugepages hugepages-1Gi capacity

        /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:216
------------------------------
•2018/07/30 05:25:01 read closing down: EOF
------------------------------
• [SLOW TEST:101.591 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
    when CPU model defined
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:340
      should report defined CPU model
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:341
------------------------------
2018/07/30 05:26:42 read closing down: EOF
• [SLOW TEST:101.075 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
    when CPU model equals to passthrough
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:368
      should report exactly the same model as node CPU
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:369
------------------------------
• [SLOW TEST:103.718 seconds]
2018/07/30 05:28:26 read closing down: EOF
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
    when CPU model not defined
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:392
      should report CPU model from libvirt capabilities
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:393
------------------------------
• [SLOW TEST:46.926 seconds]
Configurations
2018/07/30 05:29:13 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  New VirtualMachineInstance with all supported drives
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:413
    should have all the device nodes
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:436
------------------------------
••
------------------------------
• [SLOW TEST:18.431 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should update VirtualMachine once VMIs are up
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195
------------------------------
• [SLOW TEST:6.590 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should remove VirtualMachineInstance once the VMI is marked for deletion
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:204
------------------------------
•
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:31:34 http: TLS handshake error from 10.244.1.1:41170: EOF
2018/07/30 09:31:44 http: TLS handshake error from 10.244.1.1:41176: EOF
2018/07/30 09:31:54 http: TLS handshake error from 10.244.1.1:41182: EOF
2018/07/30 09:32:04 http: TLS handshake error from 10.244.1.1:41188: EOF
2018/07/30 09:32:14 http: TLS handshake error from 10.244.1.1:41194: EOF
2018/07/30 09:32:24 http: TLS handshake error from 10.244.1.1:41200: EOF
2018/07/30 09:32:34 http: TLS handshake error from 10.244.1.1:41206: EOF
2018/07/30 09:32:44 http: TLS handshake error from 10.244.1.1:41212: EOF
2018/07/30 09:32:54 http: TLS handshake error from 10.244.1.1:41218: EOF
2018/07/30 09:33:04 http: TLS handshake error from 10.244.1.1:41224: EOF
2018/07/30 09:33:14 http: TLS handshake error from 10.244.1.1:41230: EOF
2018/07/30 09:33:24 http: TLS handshake error from 10.244.1.1:41236: EOF
2018/07/30 09:33:34 http: TLS handshake error from 10.244.1.1:41242: EOF
2018/07/30 09:33:44 http: TLS handshake error from 10.244.1.1:41248: EOF
2018/07/30 09:33:54 http: TLS handshake error from 10.244.1.1:41254: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T09:32:52.662372Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:32:52.667810Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:32:53 http: TLS handshake error from 10.244.0.1:49680: EOF
2018/07/30 09:33:03 http: TLS handshake error from 10.244.0.1:49704: EOF
2018/07/30 09:33:13 http: TLS handshake error from 10.244.0.1:49728: EOF
level=info timestamp=2018-07-30T09:33:15.919157Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:33:16.545141Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:33:16.740579Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:33:23 http: TLS handshake error from 10.244.0.1:49752: EOF
2018/07/30 09:33:33 http: TLS handshake error from 10.244.0.1:49776: EOF
2018/07/30 09:33:43 http: TLS handshake error from 10.244.0.1:49800: EOF
level=info timestamp=2018-07-30T09:33:45.891155Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:33:46.663161Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:33:46.853129Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:33:53 http: TLS handshake error from 10.244.0.1:49824: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:29:41.387416Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:41.393503Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:41.393650Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:41.525517Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:41.535363Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:41.544940Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:41.545020Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:57.927925Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:57.928269Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:59.771071Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:59.771301Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:29:59.810341Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:29:59.810566Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:30:00.829960Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:30:00.830196Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijc8pq kind= uid=161f07de-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:28:26.427932Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind= uid=c4c6f604-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.612071Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:28:26.613245Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.617260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmizcwhr"
level=info timestamp=2018-07-30T09:28:26.694095Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind= uid=abf5b3e9-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.889260Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.889356Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.891624Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.933666Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.933822Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.934243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.934629Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.968286Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.968359Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.968542Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:29:57.815149Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:29:59.006281Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T09:29:59.007822Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind=Domain uid=1624b095-93db-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T09:29:59.620197Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:29:59.620411Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind=Domain uid=1624b095-93db-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T09:29:59.644478Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:29:59.647161Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:29:59.647273Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:29:59.703993Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:29:59.704187Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:29:59.710926Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:30:00.743903Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:30:00.744177Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:30:00.745248Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmijc8pq"
level=info timestamp=2018-07-30T09:30:00.810142Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijc8pq kind= uid=1624b095-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
------------------------------
• Failure [260.139 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if it gets deleted [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245

    Timed out after 240.000s.
    Expected
        : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:262
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
• [SLOW TEST:56.096 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:37:54 http: TLS handshake error from 10.244.1.1:41398: EOF
2018/07/30 09:38:04 http: TLS handshake error from 10.244.1.1:41404: EOF
2018/07/30 09:38:14 http: TLS handshake error from 10.244.1.1:41410: EOF
2018/07/30 09:38:24 http: TLS handshake error from 10.244.1.1:41416: EOF
2018/07/30 09:38:34 http: TLS handshake error from 10.244.1.1:41422: EOF
2018/07/30 09:38:44 http: TLS handshake error from 10.244.1.1:41428: EOF
2018/07/30 09:38:54 http: TLS handshake error from 10.244.1.1:41434: EOF
2018/07/30 09:39:04 http: TLS handshake error from 10.244.1.1:41440: EOF
2018/07/30 09:39:14 http: TLS handshake error from 10.244.1.1:41446: EOF
2018/07/30 09:39:24 http: TLS handshake error from 10.244.1.1:41452: EOF
2018/07/30 09:39:34 http: TLS handshake error from 10.244.1.1:41458: EOF
2018/07/30 09:39:44 http: TLS handshake error from 10.244.1.1:41464: EOF
2018/07/30 09:39:54 http: TLS handshake error from 10.244.1.1:41470: EOF
2018/07/30 09:40:04 http: TLS handshake error from 10.244.1.1:41476: EOF
2018/07/30 09:40:14 http: TLS handshake error from 10.244.1.1:41482: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T09:39:15.923964Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:39:17.991572Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:39:18.264832Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:39:23 http: TLS handshake error from 10.244.0.1:50616: EOF
2018/07/30 09:39:33 http: TLS handshake error from 10.244.0.1:50640: EOF
2018/07/30 09:39:43 http: TLS handshake error from 10.244.0.1:50664: EOF
level=info timestamp=2018-07-30T09:39:45.861632Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:39:48.125554Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:39:48.385924Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:39:52.835530Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:39:52.848872Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:39:53 http: TLS handshake error from 10.244.0.1:50688: EOF
2018/07/30 09:40:03 http: TLS handshake error from 10.244.0.1:50712: EOF
2018/07/30 09:40:13 http: TLS handshake error from 10.244.0.1:50736: EOF
level=info timestamp=2018-07-30T09:40:15.863189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:34:57.889566Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:34:57.895185Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6tkng\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6tkng"
level=info timestamp=2018-07-30T09:34:57.920586Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:34:57.922358Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:35:14.563593Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:35:14.563947Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:35:16.358333Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:35:16.358963Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:35:16.382570Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:35:16.383319Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:35:17.038910Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:35:17.039163Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T09:35:17.071494Z pos=vm.go:322 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Dispatching delete event"
level=info timestamp=2018-07-30T09:35:17.074318Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:35:17.074501Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2a84691-93db-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:28:26.427932Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind= uid=c4c6f604-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.612071Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:28:26.613245Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.617260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmizcwhr"
level=info timestamp=2018-07-30T09:28:26.694095Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind= uid=abf5b3e9-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.889260Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.889356Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.891624Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.933666Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.933822Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.934243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.934629Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.968286Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.968359Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.968542Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:35:15.809143Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind=Domain uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T09:35:16.207450Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:35:16.209299Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind=Domain uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T09:35:16.226782Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:35:16.227083Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:35:16.279381Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:35:16.279673Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:35:16.303241Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:35:16.307795Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:35:16.308353Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:35:16.325015Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:35:16.999864Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:35:17.000333Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:35:17.002921Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi6tkng"
level=info timestamp=2018-07-30T09:35:17.070198Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6tkng kind= uid=d2b14c56-93db-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

• Failure [320.212 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should stop VirtualMachineInstance if running set to false [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325

    Timed out after 300.000s.
    The vmi did not disappear
    Expected
      : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:183
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
STEP: Stopping the VirtualMachineInstance
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:43:14 http: TLS handshake error from 10.244.1.1:41590: EOF
2018/07/30 09:43:24 http: TLS handshake error from 10.244.1.1:41596: EOF
2018/07/30 09:43:34 http: TLS handshake error from 10.244.1.1:41602: EOF
2018/07/30 09:43:44 http: TLS handshake error from 10.244.1.1:41608: EOF
2018/07/30 09:43:54 http: TLS handshake error from 10.244.1.1:41614: EOF
2018/07/30 09:44:04 http: TLS handshake error from 10.244.1.1:41620: EOF
2018/07/30 09:44:14 http: TLS handshake error from 10.244.1.1:41626: EOF
2018/07/30 09:44:24 http: TLS handshake error from 10.244.1.1:41632: EOF
2018/07/30 09:44:34 http: TLS handshake error from 10.244.1.1:41638: EOF
2018/07/30 09:44:44 http: TLS handshake error from 10.244.1.1:41644: EOF
2018/07/30 09:44:54 http: TLS handshake error from 10.244.1.1:41650: EOF
2018/07/30 09:45:04 http: TLS handshake error from 10.244.1.1:41656: EOF
2018/07/30 09:45:14 http: TLS handshake error from 10.244.1.1:41662: EOF
2018/07/30 09:45:24 http: TLS handshake error from 10.244.1.1:41668: EOF
2018/07/30 09:45:34 http: TLS handshake error from 10.244.1.1:41674: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
2018/07/30 09:44:33 http: TLS handshake error from 10.244.0.1:51360: EOF
2018/07/30 09:44:43 http: TLS handshake error from 10.244.0.1:51384: EOF
level=info timestamp=2018-07-30T09:44:45.890340Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:44:49.617936Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:44:49.628498Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:44:53 http: TLS handshake error from 10.244.0.1:51408: EOF
level=info timestamp=2018-07-30T09:44:54.085160Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:44:54.089288Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:45:03 http: TLS handshake error from 10.244.0.1:51432: EOF
2018/07/30 09:45:13 http: TLS handshake error from 10.244.0.1:51456: EOF
level=info timestamp=2018-07-30T09:45:15.864678Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:45:19.769364Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:45:19.779516Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:45:23 http: TLS handshake error from 10.244.0.1:51480: EOF
2018/07/30 09:45:33 http: TLS handshake error from 10.244.0.1:51504: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:40:18.049429Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:18.079088Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:40:18.140600Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:18.141970Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:40:34.464576Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:34.466661Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:40:35.788992Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:35.789311Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:40:35.818400Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:35.818603Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:40:36.049481Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:36.049606Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T09:40:36.076500Z pos=vm.go:322 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Dispatching delete event"
level=info timestamp=2018-07-30T09:40:36.077671Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:40:36.077943Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9175abe8-93dc-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:28:26.427932Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind= uid=c4c6f604-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.612071Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:28:26.613245Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.617260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmizcwhr"
level=info timestamp=2018-07-30T09:28:26.694095Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind= uid=abf5b3e9-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.889260Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.889356Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.891624Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.933666Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.933822Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.934243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.934629Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.968286Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.968359Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.968542Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:40:34.391701Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:40:35.319110Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T09:40:35.322248Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind=Domain uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T09:40:35.571091Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:40:35.571527Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind=Domain uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T09:40:35.630220Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:40:35.630515Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:40:35.635093Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:40:35.714431Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:40:35.714815Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:40:35.726257Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:40:36.000093Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:40:36.000188Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:40:36.000652Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiqv9p7"
level=info timestamp=2018-07-30T09:40:36.069243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiqv9p7 kind= uid=9183eec8-93dc-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

• Failure [318.928 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should start and stop VirtualMachineInstance multiple times [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333

    Timed out after 300.000s.
    The vmi did not disappear
    Expected
      : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:183
------------------------------
STEP: Doing run: 0
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
STEP: Stopping the VirtualMachineInstance
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:48:34 http: TLS handshake error from 10.244.1.1:41782: EOF
2018/07/30 09:48:44 http: TLS handshake error from 10.244.1.1:41788: EOF
2018/07/30 09:48:54 http: TLS handshake error from 10.244.1.1:41794: EOF
2018/07/30 09:49:04 http: TLS handshake error from 10.244.1.1:41800: EOF
2018/07/30 09:49:14 http: TLS handshake error from 10.244.1.1:41806: EOF
2018/07/30 09:49:24 http: TLS handshake error from 10.244.1.1:41812: EOF
2018/07/30 09:49:34 http: TLS handshake error from 10.244.1.1:41818: EOF
2018/07/30 09:49:44 http: TLS handshake error from 10.244.1.1:41824: EOF
2018/07/30 09:49:54 http: TLS handshake error from 10.244.1.1:41830: EOF
2018/07/30 09:50:04 http: TLS handshake error from 10.244.1.1:41836: EOF
2018/07/30 09:50:14 http: TLS handshake error from 10.244.1.1:41842: EOF
2018/07/30 09:50:24 http: TLS handshake error from 10.244.1.1:41848: EOF
2018/07/30 09:50:34 http: TLS handshake error from 10.244.1.1:41854: EOF
2018/07/30 09:50:44 http: TLS handshake error from 10.244.1.1:41860: EOF
2018/07/30 09:50:54 http: TLS handshake error from 10.244.1.1:41866: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T09:49:53.096626Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:49:53.100329Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:49:53 http: TLS handshake error from 10.244.0.1:52128: EOF
2018/07/30 09:50:03 http: TLS handshake error from 10.244.0.1:52152: EOF
2018/07/30 09:50:13 http: TLS handshake error from 10.244.0.1:52176: EOF
level=info timestamp=2018-07-30T09:50:15.940199Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:50:21.626207Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:50:21.642657Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:50:23 http: TLS handshake error from 10.244.0.1:52200: EOF
2018/07/30 09:50:33 http: TLS handshake error from 10.244.0.1:52224: EOF
2018/07/30 09:50:43 http: TLS handshake error from 10.244.0.1:52248: EOF
level=info timestamp=2018-07-30T09:50:45.929557Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:50:51.815065Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:50:51.817154Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:50:53 http: TLS handshake error from 10.244.0.1:52272: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:45:36.971376Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:45:36.972452Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9l4nb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9l4nb"
level=info timestamp=2018-07-30T09:45:53.026247Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:53.026594Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:45:54.806483Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:54.807271Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:45:54.837194Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:54.837432Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:45:55.050603Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:55.051566Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:45:55.091767Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:55.091952Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T09:45:55.123368Z pos=vm.go:322 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Dispatching delete event"
level=info timestamp=2018-07-30T09:45:55.128339Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:45:55.132327Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4f9d8da6-93dd-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:28:26.427932Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind= uid=c4c6f604-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.612071Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:28:26.613245Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.617260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmizcwhr"
level=info timestamp=2018-07-30T09:28:26.694095Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind= uid=abf5b3e9-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.889260Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.889356Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.891624Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.933666Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.933822Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.934243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.934629Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.968286Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.968359Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.968542Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:45:54.710792Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:45:54.710932Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:45:54.713451Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:45:54.736540Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:45:54.736687Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-30T09:45:54.755223Z pos=vm.go:404 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9l4nb\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-30T09:45:54.755507Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9l4nb\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9l4nb"
level=info timestamp=2018-07-30T09:45:54.755834Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:45:54.783270Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:45:54.783571Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:45:54.801386Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:45:55.046966Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:45:55.047096Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:45:55.047760Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi9l4nb"
level=info timestamp=2018-07-30T09:45:55.103620Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

• Failure [319.067 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should not update the VirtualMachineInstance spec if Running [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346

    Timed out after 300.000s.
    The vmi did not disappear
    Expected
        : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:183
------------------------------
STEP: Updating the VMI template spec
STEP: Expecting the old VirtualMachineInstance spec still running
STEP: Restarting the VMI
STEP: Stopping the VirtualMachineInstance
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:53:14 http: TLS handshake error from 10.244.1.1:41950: EOF
2018/07/30 09:53:24 http: TLS handshake error from 10.244.1.1:41956: EOF
2018/07/30 09:53:34 http: TLS handshake error from 10.244.1.1:41962: EOF
2018/07/30 09:53:44 http: TLS handshake error from 10.244.1.1:41968: EOF
2018/07/30 09:53:54 http: TLS handshake error from 10.244.1.1:41974: EOF
2018/07/30 09:54:04 http: TLS handshake error from 10.244.1.1:41980: EOF
2018/07/30 09:54:14 http: TLS handshake error from 10.244.1.1:41986: EOF
2018/07/30 09:54:24 http: TLS handshake error from 10.244.1.1:41992: EOF
2018/07/30 09:54:34 http: TLS handshake error from 10.244.1.1:41998: EOF
2018/07/30 09:54:44 http: TLS handshake error from 10.244.1.1:42004: EOF
2018/07/30 09:54:54 http: TLS handshake error from 10.244.1.1:42010: EOF
2018/07/30 09:55:04 http: TLS handshake error from 10.244.1.1:42016: EOF
2018/07/30 09:55:14 http: TLS handshake error from 10.244.1.1:42022: EOF
2018/07/30 09:55:24 http: TLS handshake error from 10.244.1.1:42028: EOF
2018/07/30 09:55:34 http: TLS handshake error from 10.244.1.1:42034: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
2018/07/30 09:54:33 http: TLS handshake error from 10.244.0.1:52808: EOF
2018/07/30 09:54:43 http: TLS handshake error from 10.244.0.1:52832: EOF
level=info timestamp=2018-07-30T09:54:45.935217Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:54:53.053659Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:54:53.056137Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:54:53 http: TLS handshake error from 10.244.0.1:52856: EOF
level=info timestamp=2018-07-30T09:54:53.861093Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T09:54:53.863147Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:55:03 http: TLS handshake error from 10.244.0.1:52880: EOF
2018/07/30 09:55:13 http: TLS handshake error from 10.244.0.1:52904: EOF
level=info timestamp=2018-07-30T09:55:15.906548Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 09:55:23 http: TLS handshake error from 10.244.0.1:52928: EOF
level=info timestamp=2018-07-30T09:55:23.278720Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T09:55:23.280614Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 09:55:33 http: TLS handshake error from 10.244.0.1:52952: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:50:55.827754Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmiwjr5n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmiwjr5n"
level=info timestamp=2018-07-30T09:50:55.827858Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:50:55.827896Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:50:55.835556Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:50:55.837173Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:50:55.858635Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:50:55.858761Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:50:55.875329Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:50:55.875411Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:51:11.908613Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:51:11.910306Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:51:14.080267Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:51:14.080492Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:51:14.110785Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:51:14.110999Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dbc6dad-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:28:26.427932Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind= uid=c4c6f604-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.612071Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:28:26.613245Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.617260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmizcwhr"
level=info timestamp=2018-07-30T09:28:26.694095Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind= uid=abf5b3e9-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.889260Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.889356Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.891624Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmid6pqs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.933666Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.933822Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.934243Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmih752n kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.934629Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind= uid=b7ecc718-93da-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:28:26.968286Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:28:26.968359Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:28:26.968542Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmizcwhr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:50:55.540074Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind= uid=4fa4f31b-93dd-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:50:55.573290Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:50:55.573684Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:50:55.574187Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9l4nb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:51:11.831323Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:51:13.061673Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T09:51:13.072182Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind=Domain uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T09:51:13.887772Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:51:13.888308Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind=Domain uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T09:51:13.961517Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:51:13.973630Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:51:13.973979Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:51:14.003686Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:51:14.004174Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:51:14.013930Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwjr5n kind= uid=0dc1da1e-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmiwjr5n-z8twg
Pod phase: Running
2018/07/30 05:55:43 read closing down: EOF

• Failure [287.274 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should survive guest shutdown, multiple times [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387

    Timed out after 240.001s.
    No new VirtualMachineInstance instance showed up
    Expected
        : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:429
------------------------------
STEP: Creating new VMI, not running
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
STEP: Getting the running VirtualMachineInstance
STEP: Obtaining the serial console
STEP: Guest shutdown
STEP: waiting for the controller to replace the shut-down vmi with a new instance
VM testvmicjv5m was scheduled to start
• [SLOW TEST:17.677 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    Using virtctl interface
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
      should start a VirtualMachineInstance once
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436
------------------------------
VM testvmisldpd was scheduled to stop
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 09:59:54 http: TLS handshake error from 10.244.1.1:42190: EOF
2018/07/30 10:00:04 http: TLS handshake error from 10.244.1.1:42196: EOF
2018/07/30 10:00:14 http: TLS handshake error from 10.244.1.1:42202: EOF
2018/07/30 10:00:24 http: TLS handshake error from 10.244.1.1:42208: EOF
2018/07/30 10:00:34 http: TLS handshake error from 10.244.1.1:42214: EOF
2018/07/30 10:00:44 http: TLS handshake error from 10.244.1.1:42220: EOF
2018/07/30 10:00:54 http: TLS handshake error from 10.244.1.1:42226: EOF
2018/07/30 10:01:04 http: TLS handshake error from 10.244.1.1:42232: EOF
2018/07/30 10:01:14 http: TLS handshake error from 10.244.1.1:42238: EOF
2018/07/30 10:01:24 http: TLS handshake error from 10.244.1.1:42244: EOF
2018/07/30 10:01:34 http: TLS handshake error from 10.244.1.1:42250: EOF
2018/07/30 10:01:44 http: TLS handshake error from 10.244.1.1:42256: EOF
2018/07/30 10:01:54 http: TLS handshake error from 10.244.1.1:42262: EOF
2018/07/30 10:02:04 http: TLS handshake error from 10.244.1.1:42268: EOF
2018/07/30 10:02:14 http: TLS handshake error from 10.244.1.1:42274: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T10:01:15.963770Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:01:23 http: TLS handshake error from 10.244.0.1:53792: EOF
level=info timestamp=2018-07-30T10:01:24.975265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:01:24.991410Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:01:33 http: TLS handshake error from 10.244.0.1:53816: EOF
2018/07/30 10:01:43 http: TLS handshake error from 10.244.0.1:53840: EOF
level=info timestamp=2018-07-30T10:01:46.101114Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:01:53 http: TLS handshake error from 10.244.0.1:53864: EOF
level=info timestamp=2018-07-30T10:01:53.732232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:01:53.737049Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:01:55.117648Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:01:55.143651Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:02:03 http: TLS handshake error from 10.244.0.1:53888: EOF
2018/07/30 10:02:13 http: TLS handshake error from 10.244.0.1:53912: EOF
level=info timestamp=2018-07-30T10:02:15.902113Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:56:00.850570Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:00.854934Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:00.918349Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:00.920816Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:16.859990Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:16.864458Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:18.829065Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:18.829159Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:18.852074Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:18.852142Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:18.975189Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:18.975286Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T09:56:19.044570Z pos=vm.go:322 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Dispatching delete event"
level=info timestamp=2018-07-30T09:56:19.044903Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:19.044988Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:56:18.792251Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:18.792443Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:56:18.799498Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:56:18.816582Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:18.817191Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-30T09:56:18.826884Z pos=vm.go:404 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisldpd\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-30T09:56:18.827033Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisldpd\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmisldpd"
level=info timestamp=2018-07-30T09:56:18.827165Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:56:18.837282Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:18.837429Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:56:18.843474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:19.045756Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:56:19.045939Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:56:19.047844Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmisldpd"
level=info timestamp=2018-07-30T09:56:19.142418Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T09:55:59.628569Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:55:59.628875Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind=Domain uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T09:55:59.664860Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T09:55:59.667814Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:55:59.667936Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T09:55:59.728599Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:55:59.728784Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:55:59.736962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:00.456435Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:56:00.456823Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:56:00.459051Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmicjv5m"
level=info timestamp=2018-07-30T09:56:00.525329Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind= uid=b915ae87-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:00.525634Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T09:56:00.525794Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T09:56:00.526424Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicjv5m kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [378.884 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
A valid VirtualMachine given
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
Using virtctl interface
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
should stop a VirtualMachineInstance once [It]
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467

Timed out after 360.000s.
Expected
: false
to be true

/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:490
------------------------------
STEP: getting an VMI
STEP: Invoking virtctl stop
STEP: Ensuring VMI is running
STEP: Ensuring VMI is not running
•
------------------------------
• [SLOW TEST:17.599 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
should start it
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:76
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:00:44 http: TLS handshake error from 10.244.1.1:42220: EOF
2018/07/30 10:00:54 http: TLS handshake error from 10.244.1.1:42226: EOF
2018/07/30 10:01:04 http: TLS handshake error from 10.244.1.1:42232: EOF
2018/07/30 10:01:14 http: TLS handshake error from 10.244.1.1:42238: EOF
2018/07/30 10:01:24 http: TLS handshake error from 10.244.1.1:42244: EOF
2018/07/30 10:01:34 http: TLS handshake error from 10.244.1.1:42250: EOF
2018/07/30 10:01:44 http: TLS handshake error from 10.244.1.1:42256: EOF
2018/07/30 10:01:54 http: TLS handshake error from 10.244.1.1:42262: EOF
2018/07/30 10:02:04 http: TLS handshake error from 10.244.1.1:42268: EOF
2018/07/30 10:02:14 http: TLS handshake error from 10.244.1.1:42274: EOF
2018/07/30 10:02:24 http: TLS handshake error from 10.244.1.1:42280: EOF
2018/07/30 10:02:34 http: TLS handshake error from 10.244.1.1:42286: EOF
2018/07/30 10:02:44 http: TLS handshake error from 10.244.1.1:42292: EOF
2018/07/30 10:02:54 http: TLS handshake error from 10.244.1.1:42298: EOF
2018/07/30 10:03:04 http: TLS handshake error from 10.244.1.1:42304: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
2018/07/30 10:02:03 http: TLS handshake error from 10.244.0.1:53888: EOF
2018/07/30 10:02:13 http: TLS handshake error from 10.244.0.1:53912: EOF
level=info timestamp=2018-07-30T10:02:15.902113Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:02:23 http: TLS handshake error from 10.244.0.1:53936: EOF
level=info timestamp=2018-07-30T10:02:25.281643Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:02:25.366783Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:02:33 http: TLS handshake error from 10.244.0.1:53960: EOF
2018/07/30 10:02:43 http: TLS handshake error from 10.244.0.1:53984: EOF
level=info timestamp=2018-07-30T10:02:45.964017Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:02:53 http: TLS handshake error from 10.244.0.1:54008: EOF
level=info timestamp=2018-07-30T10:02:53.740550Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:02:53.742003Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:02:55.444052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:02:55.499949Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:03:03 http: TLS handshake error from 10.244.0.1:54032: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T09:56:18.852074Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:18.852142Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-30T09:56:18.975189Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:18.975286Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T09:56:19.044570Z pos=vm.go:322 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Dispatching delete event"
level=info timestamp=2018-07-30T09:56:19.044903Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-30T09:56:19.044988Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisldpd kind= uid=c389a0b4-93de-11e8-97bf-525500d15501 msg="Creating or the VirtualMachineInstance: false"
level=info timestamp=2018-07-30T10:02:19.706209Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid76g4 kind= uid=a5667fc6-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:02:19.707621Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid76g4 kind= uid=a5667fc6-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:02:19.768326Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid76g4\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid76g4"
level=info timestamp=2018-07-30T10:02:19.996193Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimfjh9 kind= uid=a595cc05-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:02:19.996399Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimfjh9 kind= uid=a595cc05-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:02:37.735959Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:02:37.736162Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:02:37.818473Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijrmsz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijrmsz"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T09:56:18.827033Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisldpd\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmisldpd"
level=info timestamp=2018-07-30T09:56:18.827165Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:56:18.837282Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:18.837429Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T09:56:18.843474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T09:56:19.045756Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T09:56:19.045939Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T09:56:19.047844Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmisldpd"
level=info timestamp=2018-07-30T09:56:19.142418Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:02:19.441365Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:02:19.443217Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:02:19.448252Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind= uid=c38c9adc-93de-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:02:19.503609Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:02:19.505288Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:02:19.508323Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisldpd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T10:02:54.472016Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:02:54.472399Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind=Domain uid=b028d022-93df-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:02:55.097045Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:02:55.097288Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind=Domain uid=b028d022-93df-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:02:55.126150Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:02:55.126322Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:02:55.137077Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:02:55.176944Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:02:55.177280Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-30T10:02:55.191978Z pos=vm.go:404 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijrmsz\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-30T10:02:55.192177Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijrmsz\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmijrmsz"
level=info timestamp=2018-07-30T10:02:55.192394Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:02:55.201123Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:02:55.201368Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:02:55.211320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmijrmsz kind= uid=b028d022-93df-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmijrmsz-lwzxn
Pod phase: Running

• Failure [29.406 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
should attach virt-launcher to it [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:82

Timed out after 11.066s.
Expected
:
to contain substring
: Found PID for

/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:92
------------------------------
level=info timestamp=2018-07-30T10:02:38.239067Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmijrmsz kind=VirtualMachineInstance uid=b028d022-93df-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmijrmsz-lwzxn"
level=info timestamp=2018-07-30T10:02:54.091883Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmijrmsz kind=VirtualMachineInstance uid=b028d022-93df-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmijrmsz-lwzxn"
level=info timestamp=2018-07-30T10:02:55.625767Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmijrmsz kind=VirtualMachineInstance uid=b028d022-93df-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:02:55.657523Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmijrmsz kind=VirtualMachineInstance uid=b028d022-93df-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Getting virt-launcher logs
••••2018/07/30 06:03:43 read closing down: EOF
------------------------------
• [SLOW TEST:35.239 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with boot order
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:170
should be able to boot from selected disk
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
Alpine as first boot
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/30 06:04:10 read closing down: EOF
• [SLOW TEST:27.065 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with boot order
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:170
should be able to boot from selected disk
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
Cirros as first boot
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:17.217 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with user-data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:201
without k8s secret
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:202
should retry starting the VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:203
------------------------------
• [SLOW TEST:17.240 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with user-data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:201
without k8s secret
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:202
should log warning and proceed once the secret is there
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:233
------------------------------
• [SLOW TEST:40.143 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
when virt-launcher crashes
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:281
should be stopped and have Failed phase
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:282
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:04:24 http: TLS handshake error from 10.244.1.1:42356: EOF
2018/07/30 10:04:34 http: TLS handshake error from 10.244.1.1:42362: EOF
2018/07/30 10:04:44 http: TLS handshake error from 10.244.1.1:42368: EOF
2018/07/30 10:04:54 http: TLS handshake error from 10.244.1.1:42374: EOF
2018/07/30 10:05:04 http: TLS handshake error from 10.244.1.1:42380: EOF
2018/07/30 10:05:14 http: TLS handshake error from 10.244.1.1:42386: EOF
2018/07/30 10:05:24 http: TLS handshake error from 10.244.1.1:42392: EOF
2018/07/30 10:05:34 http: TLS handshake error from 10.244.1.1:42398: EOF
2018/07/30 10:05:44 http: TLS handshake error from 10.244.1.1:42404: EOF
2018/07/30 10:05:54 http: TLS handshake error from 10.244.1.1:42412: EOF
2018/07/30 10:06:04 http: TLS handshake error from 10.244.1.1:42418: EOF
2018/07/30 10:06:14 http: TLS handshake error from 10.244.1.1:42424: EOF
2018/07/30 10:06:24 http: TLS handshake error from 10.244.1.1:42430: EOF
2018/07/30 10:06:34 http: TLS handshake error from 10.244.1.1:42436: EOF
2018/07/30 10:06:44 http: TLS handshake error from 10.244.1.1:42442: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T10:05:45.972013Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:05:53 http: TLS handshake error from 10.244.0.1:54452: EOF
level=info timestamp=2018-07-30T10:05:56.305255Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:05:56.364659Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:06:03 http: TLS handshake error from 10.244.0.1:54476: EOF
2018/07/30 10:06:13 http: TLS handshake error from 10.244.0.1:54500: EOF
level=info timestamp=2018-07-30T10:06:15.954315Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:06:23 http: TLS handshake error from 10.244.0.1:54524: EOF
level=info timestamp=2018-07-30T10:06:26.458616Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:06:26.502460Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:06:33 http: TLS handshake error from 10.244.0.1:54548: EOF
2018/07/30 10:06:43 http: TLS handshake error from 10.244.0.1:54572: EOF
level=info timestamp=2018-07-30T10:06:45.721165Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:06:45.738322Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:06:46.241998Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T10:03:43.797328Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigwvtv kind= uid=d7855914-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:03:44.270005Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmigwvtv\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmigwvtv"
level=info timestamp=2018-07-30T10:04:10.722664Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:04:10.723070Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:04:10.778135Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipjb6j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipjb6j"
level=info timestamp=2018-07-30T10:04:27.816169Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiff7j7 kind= uid=f1c4be6b-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:04:27.817365Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiff7j7 kind= uid=f1c4be6b-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:04:27.945173Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiff7j7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiff7j7"
level=info timestamp=2018-07-30T10:04:27.963300Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiff7j7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiff7j7"
level=info timestamp=2018-07-30T10:04:45.041281Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixpknt kind= uid=fc0a67a4-93df-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:04:45.041435Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixpknt kind= uid=fc0a67a4-93df-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:05:25.293298Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv72pw kind= uid=14079315-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:05:25.293499Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv72pw kind= uid=14079315-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:05:25.383115Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv72pw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv72pw"
level=info timestamp=2018-07-30T10:05:25.401099Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv72pw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv72pw"
Pod name: virt-handler-k84xh
Pod phase: Running
level=info timestamp=2018-07-30T10:04:27.500580Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-07-30T10:04:27.510205Z pos=vm.go:397 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 reason="secrets \"nonexistent\" not found" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-07-30T10:04:27.510482Z pos=vm.go:251 component=virt-handler reason="secrets \"nonexistent\" not found" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipjb6j"
level=info timestamp=2018-07-30T10:04:27.530954Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-07-30T10:04:27.540017Z pos=vm.go:397 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 reason="secrets \"nonexistent\" not found" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-07-30T10:04:27.540347Z pos=vm.go:251 component=virt-handler reason="secrets \"nonexistent\" not found" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipjb6j"
level=info timestamp=2018-07-30T10:04:27.580831Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-07-30T10:04:27.593464Z pos=vm.go:397 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 reason="secrets \"nonexistent\" not found" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-07-30T10:04:27.594309Z pos=vm.go:251 component=virt-handler reason="secrets \"nonexistent\" not found" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipjb6j"
level=info timestamp=2018-07-30T10:04:27.621974Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-30T10:04:27.661883Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind= uid=e792cdfa-93df-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:04:27.680769Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-30T10:04:27.681099Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:04:27.681831Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-30T10:04:27.681938Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmipjb6j kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-zktsz
Pod phase: Running
level=info timestamp=2018-07-30T10:05:48.313775Z pos=virt-handler.go:87 component=virt-handler hostname=node02
level=info timestamp=2018-07-30T10:05:48.328821Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-30T10:05:48.330366Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-30T10:05:48.330651Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv72pw_sock"
level=info timestamp=2018-07-30T10:05:48.403262Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmiv72pw kind=Domain uid= msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:05:48.429991Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-30T10:05:48.448075Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-30T10:05:48.449492Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
level=info timestamp=2018-07-30T10:05:48.532978Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:05:48.559014Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmiv72pw-z9ndr
Pod phase: Running
Pod name: vmi-killer2x9kk
Pod phase: Succeeded
Pod name: vmi-killerqm2g8
Pod phase: Succeeded

• Failure [83.218 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
when virt-handler crashes
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:305
should recover and continue management [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:306

Expected
: Running
to equal
: Failed

/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:332
------------------------------
level=info timestamp=2018-07-30T10:05:25.800795Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiv72pw-z9ndr"
level=info timestamp=2018-07-30T10:05:41.368638Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiv72pw-z9ndr"
level=info timestamp=2018-07-30T10:05:43.105033Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:05:43.126436Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Crashing the virt-handler
STEP: Killing the VirtualMachineInstance
level=info timestamp=2018-07-30T10:05:47.981394Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiv72pw-z9ndr"
level=info timestamp=2018-07-30T10:05:47.981635Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiv72pw-z9ndr"
level=info timestamp=2018-07-30T10:05:47.982516Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:05:47.982602Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
level=info timestamp=2018-07-30T10:05:49.043824Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiv72pw kind=VirtualMachineInstance uid=14079315-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
STEP: Checking that VirtualMachineInstance has 'Failed' phase
• [SLOW TEST:57.042 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
when virt-handler is responsive
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:336
should indicate that a node is ready for vmis
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:337
------------------------------
• [SLOW TEST:54.299 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
when virt-handler is not responsive
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:367
the node controller should react
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:406
------------------------------
• [SLOW TEST:18.312 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with node tainted
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:459
the vmi with tolerations should be scheduled
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:481
------------------------------
•
------------------------------
S [SKIPPING] [0.292 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with non default namespace
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:531
should log libvirt start and stop lifecycle events of the domain
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
kubevirt-test-default [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

Skip log query tests for JENKINS ci test environment
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:536
------------------------------
S [SKIPPING] [0.143 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with non default namespace
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:531
should log libvirt start and stop lifecycle events of the domain
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
kubevirt-test-alternative [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

Skip log query tests for JENKINS ci test environment
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:536
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.171 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should enable emulation in virt-launcher [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:604

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.161 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should be reflected in domain XML [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:641

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.153 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should request a TUN device but not KVM [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:685

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
••••
------------------------------
• [SLOW TEST:35.763 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Delete a VirtualMachineInstance's Pod
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:837
should result in the VirtualMachineInstance moving to a finalized state
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:838
------------------------------
• [SLOW TEST:20.449 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Delete a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
with an active pod.
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:870
should result in pod being terminated
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:871
------------------------------
Pod name: disks-images-provider-9tn5b Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-tqbmm Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-frt4f Pod phase: Running
2018/07/30 10:08:24 http: TLS handshake error from 10.244.1.1:42504: EOF
2018/07/30 10:08:34 http: TLS handshake error from 10.244.1.1:42510: EOF
2018/07/30 10:08:44 http: TLS handshake error from 10.244.1.1:42516: EOF
2018/07/30 10:08:54 http: TLS handshake error from 10.244.1.1:42522: EOF
2018/07/30 10:09:04 http: TLS handshake error from 10.244.1.1:42528: EOF
2018/07/30 10:09:14 http: TLS handshake error from 10.244.1.1:42534: EOF
2018/07/30 10:09:24 http: TLS handshake error from 10.244.1.1:42540: EOF
2018/07/30 10:09:34 http: TLS handshake error from 10.244.1.1:42546: EOF
2018/07/30 10:09:44 http: TLS handshake error from 10.244.1.1:42552: EOF
2018/07/30 10:09:54 http: TLS handshake error from 10.244.1.1:42558: EOF
2018/07/30 10:10:04 http: TLS handshake error from 10.244.1.1:42564: EOF
2018/07/30 10:10:14 http: TLS handshake error from 10.244.1.1:42570: EOF
2018/07/30 10:10:24 http: TLS handshake error from 10.244.1.1:42576: EOF
2018/07/30 10:10:34 http: TLS handshake error from 10.244.1.1:42582: EOF
2018/07/30 10:10:44 http: TLS handshake error from 10.244.1.1:42588: EOF

Pod name: virt-api-7d79764579-hmhf7 Pod phase: Running
level=info timestamp=2018-07-30T10:09:45.935515Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:09:53 http: TLS handshake error from 10.244.0.1:55032: EOF
level=info timestamp=2018-07-30T10:09:53.450540Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:09:53.493630Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:09:57.509442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:09:57.524821Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:10:03 http: TLS handshake error from 10.244.0.1:55056: EOF
2018/07/30 10:10:13 http: TLS handshake error from 10.244.0.1:55080: EOF
level=info timestamp=2018-07-30T10:10:16.035554Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:10:23 http: TLS handshake error from 10.244.0.1:55104: EOF
level=info timestamp=2018-07-30T10:10:27.697385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:10:27.700272Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:10:33 http: TLS handshake error from 10.244.0.1:55128: EOF
2018/07/30 10:10:43 http: TLS handshake error from 10.244.0.1:55152: EOF
level=info timestamp=2018-07-30T10:10:45.972234Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-d9qsl Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-rs8jn Pod phase: Running
level=info timestamp=2018-07-30T10:08:58.481180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9q4dm kind= uid=9313ba6f-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:00.834831Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimg56m kind= uid=947fc068-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:00.835157Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimg56m kind= uid=947fc068-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:01.015616Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimg56m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimg56m"
level=info timestamp=2018-07-30T10:09:01.782201Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil49w9 kind= uid=951160de-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:01.782349Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil49w9 kind= uid=951160de-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:02.642471Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip2kts kind= uid=9592ffd1-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:02.645385Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip2kts kind= uid=9592ffd1-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:02.782659Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip2kts\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip2kts"
level=info timestamp=2018-07-30T10:09:38.497335Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:38.497537Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:38.606854Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8kkkh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8kkkh"
level=info timestamp=2018-07-30T10:09:58.899069Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:58.899245Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:58.945152Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi84hhg\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi84hhg"

Pod name: virt-handler-9gqdg Pod phase: Running
level=info timestamp=2018-07-30T10:10:15.536394Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:10:15.539663Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=Domain uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:10:15.561922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:15.562330Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:10:15.578161Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:10:15.816245Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:15.816570Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:10:15.878510Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.051653Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.054101Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.589896Z pos=vm.go:533 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Signaled graceful shutdown for testvmi84hhg"
level=info timestamp=2018-07-30T10:10:16.590203Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.590351Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.590381Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.590589Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-handler-t48x9 Pod phase: Running
level=info timestamp=2018-07-30T10:09:56.305426Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:56.305524Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:09:56.339586Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:56.339918Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:09:56.348641Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:56.448398Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:09:56.448595Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:09:56.449645Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi8kkkh"
level=info timestamp=2018-07-30T10:09:56.499681Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:58.634688Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:09:58.635089Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:09:58.636322Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:58.669811Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:09:58.669967Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:09:58.670472Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
• Failure [48.527 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Delete a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
with grace period greater than 0
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:894
should run graceful shutdown [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:895

Timed out after 30.001s.
Expected
    : level=info timestamp=2018-07-30T10:08:56.410243Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:08:57.195378Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:08:57.196009Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind=Domain uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:08:57.536583Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:08:57.537027Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind=Domain uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:08:57.562472Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:08:57.563922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:08:57.564177Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:08:57.657855Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:08:57.658131Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:08:57.672633Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:08:57.955769Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:08:57.956085Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:08:57.961348Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiq7pvj"
level=info timestamp=2018-07-30T10:08:58.087413Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind= uid=87efba9e-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:08:58.254120Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:08:58.254270Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:08:58.254790Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiq7pvj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:14.357575Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:10:15.092380Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:10:15.092662Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=Domain uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:10:15.536394Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:10:15.539663Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=Domain uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:10:15.561922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:15.562330Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:10:15.578161Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:10:15.816245Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:15.816570Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:10:15.878510Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.051653Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.054101Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.589896Z pos=vm.go:533 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Signaled graceful shutdown for testvmi84hhg"
level=info timestamp=2018-07-30T10:10:16.590203Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.590351Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.590381Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.590589Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
to contain substring
    : grace period expired, killing deleted VirtualMachineInstance testvmi84hhg

/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:941
------------------------------
STEP: Setting a VirtualMachineInstance termination grace period to 5
STEP: Creating the VirtualMachineInstance
level=info timestamp=2018-07-30T10:09:59.384236Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi84hhg-pjb8f"
level=info timestamp=2018-07-30T10:10:14.690037Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi84hhg-pjb8f"
level=info timestamp=2018-07-30T10:10:16.086654Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:10:16.337669Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Deleting the VirtualMachineInstance
level=info timestamp=2018-07-30T10:10:17.084508Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi84hhg-pjb8f"
level=info timestamp=2018-07-30T10:10:17.084660Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi84hhg-pjb8f"
level=info timestamp=2018-07-30T10:10:17.085352Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:10:17.085507Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
level=info timestamp=2018-07-30T10:10:17.142585Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance stopping"
STEP: Checking that virt-handler logs VirtualMachineInstance graceful shutdown
STEP: Checking that the VirtualMachineInstance does not exist after grace period
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:09:54 http: TLS handshake error from 10.244.1.1:42558: EOF
2018/07/30 10:10:04 http: TLS handshake error from 10.244.1.1:42564: EOF
2018/07/30 10:10:14 http: TLS handshake error from 10.244.1.1:42570: EOF
2018/07/30 10:10:24 http: TLS handshake error from 10.244.1.1:42576: EOF
2018/07/30 10:10:34 http: TLS handshake error from 10.244.1.1:42582: EOF
2018/07/30 10:10:44 http: TLS handshake error from 10.244.1.1:42588: EOF
2018/07/30 10:10:54 http: TLS handshake error from 10.244.1.1:42594: EOF
2018/07/30 10:11:04 http: TLS handshake error from 10.244.1.1:42600: EOF
2018/07/30 10:11:14 http: TLS handshake error from 10.244.1.1:42606: EOF
2018/07/30 10:11:24 http: TLS handshake error from 10.244.1.1:42612: EOF
2018/07/30 10:11:34 http: TLS handshake error from 10.244.1.1:42618: EOF
2018/07/30 10:11:44 http: TLS handshake error from 10.244.1.1:42624: EOF
2018/07/30 10:11:54 http: TLS handshake error from 10.244.1.1:42630: EOF
2018/07/30 10:12:04 http: TLS handshake error from 10.244.1.1:42636: EOF
2018/07/30 10:12:14 http: TLS handshake error from 10.244.1.1:42642: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
2018/07/30 10:11:13 http: TLS handshake error from 10.244.0.1:55224: EOF
level=info timestamp=2018-07-30T10:11:15.949232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:11:23 http: TLS handshake error from 10.244.0.1:55248: EOF
level=info timestamp=2018-07-30T10:11:28.049070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:11:28.051119Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:11:33 http: TLS handshake error from 10.244.0.1:55272: EOF
2018/07/30 10:11:43 http: TLS handshake error from 10.244.0.1:55296: EOF
level=info timestamp=2018-07-30T10:11:46.013765Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:11:53 http: TLS handshake error from 10.244.0.1:55320: EOF
level=info timestamp=2018-07-30T10:11:53.271547Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:11:53.283092Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:11:58.196202Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:11:58.198567Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:12:03 http: TLS handshake error from 10.244.0.1:55344: EOF
2018/07/30 10:12:13 http: TLS handshake error from 10.244.0.1:55368: EOF
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T09:10:40.160963Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-rs8jn
Pod phase: Running
level=info timestamp=2018-07-30T10:09:00.835157Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimg56m kind= uid=947fc068-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:01.015616Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimg56m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimg56m"
level=info timestamp=2018-07-30T10:09:01.782201Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil49w9 kind= uid=951160de-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:01.782349Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil49w9 kind= uid=951160de-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:02.642471Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip2kts kind= uid=9592ffd1-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:02.645385Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip2kts kind= uid=9592ffd1-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:02.782659Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip2kts\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip2kts"
level=info timestamp=2018-07-30T10:09:38.497335Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:38.497537Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:38.606854Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8kkkh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8kkkh"
level=info timestamp=2018-07-30T10:09:58.899069Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:09:58.899245Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:09:58.945152Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi84hhg\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi84hhg"
level=info timestamp=2018-07-30T10:10:47.504780Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:10:47.506524Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-9gqdg
Pod phase: Running
level=info timestamp=2018-07-30T10:10:15.816570Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:10:15.878510Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.051653Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.054101Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.589896Z pos=vm.go:533 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Signaled graceful shutdown for testvmi84hhg"
level=info timestamp=2018-07-30T10:10:16.590203Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:16.590351Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:16.590381Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:16.590589Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:47.274974Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:10:47.276962Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:47.279286Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind= uid=b71d76df-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:10:47.318048Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:10:47.318309Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:10:47.320236Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi84hhg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-t48x9
Pod phase: Running
level=info timestamp=2018-07-30T10:09:58.636322Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind= uid=aaeb1b71-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:09:58.669811Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:09:58.669967Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:09:58.670472Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi8kkkh kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:11:03.787345Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:11:04.707070Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:11:04.707699Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind=Domain uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:11:04.973673Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:11:04.973945Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind=Domain uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:11:05.012299Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:11:05.012446Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:11:05.016436Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:11:05.061587Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:11:05.061772Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:11:05.070032Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9w2fd kind= uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi9w2fd-vzj78
Pod phase: Running
Pod name: vmi-killernq85g
Pod phase: Succeeded
• Failure [88.512 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:946
    should be in Failed phase [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:947
    Expected
        : Running
    to equal
        : Failed
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:970
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-30T10:10:48.009583Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi9w2fd-vzj78"
level=info timestamp=2018-07-30T10:11:04.217012Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi9w2fd-vzj78"
level=info timestamp=2018-07-30T10:11:05.494555Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:11:05.521564Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Killing the VirtualMachineInstance
level=info timestamp=2018-07-30T10:11:15.678556Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi9w2fd-vzj78"
level=info timestamp=2018-07-30T10:11:15.678745Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi9w2fd-vzj78"
level=info timestamp=2018-07-30T10:11:15.679426Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:11:15.679596Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmi9w2fd kind=VirtualMachineInstance uid=d41054ff-93e0-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Checking that the VirtualMachineInstance has 'Failed' phase
• [SLOW TEST:82.570 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:946
    should be left alone by virt-handler
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:974
------------------------------
•
------------------------------
• [SLOW TEST:7.955 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should scale
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    to five, to six and then to zero replicas
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••
------------------------------
• [SLOW TEST:17.846 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should update readyReplicas once VMIs are up
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:157
------------------------------
• [SLOW TEST:8.681 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should remove VMIs once it is marked for deletion
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:169
------------------------------
•
------------------------------
• [SLOW TEST:5.598 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should not scale when paused and scale when resume
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:223
------------------------------
•
------------------------------
• [SLOW TEST:19.015 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should successfully start with hook sidecar annotation
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60
------------------------------
• [SLOW TEST:19.097 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should call Collect and OnDefineDomain on the hook sidecar
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67
------------------------------
• [SLOW TEST:20.726 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should update domain XML with SM BIOS properties
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:83
------------------------------
• [SLOW TEST:37.388 seconds]
LeaderElection
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43
  Start a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53
    when the controller pod is not running
    /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54
      should success
      /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:15:04 http: TLS handshake error from 10.244.1.1:42744: EOF
2018/07/30 10:15:14 http: TLS handshake error from 10.244.1.1:42750: EOF
2018/07/30 10:15:24 http: TLS handshake error from 10.244.1.1:42756: EOF
2018/07/30 10:15:34 http: TLS handshake error from 10.244.1.1:42762: EOF
2018/07/30 10:15:44 http: TLS handshake error from 10.244.1.1:42768: EOF
2018/07/30 10:15:54 http: TLS handshake error from 10.244.1.1:42774: EOF
2018/07/30 10:16:04 http: TLS handshake error from 10.244.1.1:42780: EOF
2018/07/30 10:16:14 http: TLS handshake error from 10.244.1.1:42786: EOF
2018/07/30 10:16:24 http: TLS handshake error from 10.244.1.1:42792: EOF
2018/07/30 10:16:34 http: TLS handshake error from 10.244.1.1:42798: EOF
2018/07/30 10:16:44 http: TLS handshake error from 10.244.1.1:42804: EOF
2018/07/30 10:16:54 http: TLS handshake error from 10.244.1.1:42810: EOF
2018/07/30 10:17:04 http: TLS handshake error from 10.244.1.1:42816: EOF
2018/07/30 10:17:14 http: TLS handshake error from 10.244.1.1:42822: EOF
2018/07/30 10:17:24 http: TLS handshake error from 10.244.1.1:42828: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T10:16:45.741612Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:16:45.961548Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=error timestamp=2018-07-30T10:16:49.480051Z pos=subresource.go:85 component=virt-api msg=
2018/07/30 10:16:49 http: response.WriteHeader on hijacked connection
level=error timestamp=2018-07-30T10:16:49.480497Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.0.5:8443->10.244.0.1:58298: use of closed network connection" msg="error ecountered reading from websocket stream"
level=info timestamp=2018-07-30T10:16:49.481497Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi65ql9/console proto=HTTP/1.1 statusCode=200 contentLength=0
2018/07/30 10:16:53 http: TLS handshake error from 10.244.0.1:56052: EOF
level=info timestamp=2018-07-30T10:16:53.438386Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:16:53.442324Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:16:59.790678Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:16:59.842955Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:17:03 http: TLS handshake error from 10.244.0.1:56076: EOF
2018/07/30 10:17:13 http: TLS handshake error from 10.244.0.1:56100: EOF
level=info timestamp=2018-07-30T10:17:15.921054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:17:23 http: TLS handshake error from 10.244.0.1:56124: EOF
Pod name: virt-controller-7d57d96b65-d46hg
Pod phase: Running
level=info timestamp=2018-07-30T10:15:33.059152Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T10:15:49.268022Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-30T10:15:49.268136Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer"
level=info timestamp=2018-07-30T10:15:49.268222Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-30T10:15:49.268302Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-30T10:15:49.268420Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-30T10:15:49.269303Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-30T10:15:49.272706Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-30T10:15:49.273508Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-30T10:15:49.273973Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-30T10:15:49.274183Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-07-30T10:15:50.080353Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2dq6 kind= uid=88762577-93e1-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:15:50.082134Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2dq6 kind= uid=88762577-93e1-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:16:07.170133Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:16:07.172054Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:16:07.264194Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi65ql9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi65ql9"
Pod name: virt-handler-9gqdg
Pod phase: Running
level=info timestamp=2018-07-30T10:14:08.043584Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:14:08.046588Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind=Domain uid=42000700-93e1-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:14:08.089449Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:14:08.097404Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:14:08.097671Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:14:08.266613Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:14:08.277525Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:14:08.282921Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:14:09.590304Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:14:09.590453Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:14:09.595219Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi6z7pkxfcrd"
level=info timestamp=2018-07-30T10:14:09.628459Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind= uid=42000700-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:14:09.724611Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:14:09.724894Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:14:09.725105Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6z7pkxfcrd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-t48x9
Pod phase: Running
level=info timestamp=2018-07-30T10:16:06.975827Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmin2dq6 kind= uid=88762577-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:16:06.983293Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmin2dq6 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:16:06.983410Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmin2dq6 kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:16:06.983890Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmin2dq6 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:16:21.967228Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:16:22.968337Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:16:22.968604Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind=Domain uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:16:23.836911Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:16:23.837278Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind=Domain uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:16:23.877501Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:16:23.878156Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:16:23.888640Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:16:23.902572Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:16:23.904361Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:16:23.918131Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi65ql9 kind= uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi65ql9-6rl6z
Pod phase: Running
2018/07/30 06:17:25 read closing down: EOF
• Failure [78.248 seconds]
Health Monitoring
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37
A VirtualMachineInstance with a watchdog device
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56
should be shut down when the watchdog expires [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57
Timed out after 40.013s.
Expected
: Running
to equal
: Failed
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:85
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-30T10:16:07.873232Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi65ql9 kind=VirtualMachineInstance uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi65ql9-6rl6z"
level=info timestamp=2018-07-30T10:16:22.415785Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi65ql9 kind=VirtualMachineInstance uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi65ql9-6rl6z"
level=info timestamp=2018-07-30T10:16:24.342831Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi65ql9 kind=VirtualMachineInstance uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:16:24.351362Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi65ql9 kind=VirtualMachineInstance uid=92a8b184-93e1-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Expecting the VirtualMachineInstance console
STEP: Killing the watchdog device
STEP: Checking that the VirtualMachineInstance has Failed status
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
Launching VMI from VM Template [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
with given Fedora Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
should succeed to generate a VM JSON file using oc-process command
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:150
Skip test that requires oc binary
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1387
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
Launching VMI from VM Template [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
with given Fedora Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
with given VM JSON from the Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
should succeed to create a VM using oc-create command
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:156
Skip test that requires oc binary
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1387
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
Launching VMI from VM Template [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
with given Fedora Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
with given VM JSON from the Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
with given VM from the VM JSON
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
should succeed to launch a VMI using oc-patch command
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:161
Skip test that requires oc binary
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1387
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
Launching VMI from VM Template [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
with given Fedora Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
with given VM JSON from the Template
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
with given VM from the VM JSON
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
with given VMI from the VM
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:163
should succeed to terminate the VMI using oc-patch command
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:166
Skip test that requires oc binary
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1387
------------------------------
• [SLOW TEST:14.906 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.554 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given an vm
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.290 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi preset
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.711 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi replica set
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:18.231 seconds]
VNC
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:54
with VNC connection
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:62
should allow accessing the VNC device
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:64
------------------------------
••2018/07/30 06:19:43 read closing down: EOF
2018/07/30 06:19:54 read closing down: EOF
2018/07/30 06:20:05 read closing down: EOF
2018/07/30 06:20:15 read closing down: EOF
2018/07/30 06:20:16 read closing down: EOF
2018/07/30 06:20:18 read closing down: EOF
2018/07/30 06:20:19 read closing down: EOF
------------------------------
2018/07/30 06:20:19 read closing down: EOF
• [SLOW TEST:97.010 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
should be able to reach
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
the Inbound VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/30 06:20:21 read closing down: EOF
•2018/07/30 06:20:22 read closing down: EOF
2018/07/30 06:20:22 read closing down: EOF
2018/07/30 06:20:24 read closing down: EOF
2018/07/30 06:20:25 read closing down: EOF
•2018/07/30 06:20:25 read closing down: EOF
2018/07/30 06:20:27 read closing down: EOF
2018/07/30 06:20:28 read closing down: EOF
•2018/07/30 06:20:28 read closing down: EOF
------------------------------
• [SLOW TEST:5.188 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
should be reachable via the propagated IP from a Pod
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
on the same node from Pod
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••••
------------------------------
• [SLOW TEST:6.451 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
with a service matching the vmi exposed
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:287
should fail to reach the vmi if an invalid servicename is used
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:318
------------------------------
•2018/07/30 06:21:32 read closing down: EOF
2018/07/30 06:21:33 read closing down: EOF
------------------------------
• [SLOW TEST:36.947 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
VirtualMachineInstance with custom interface model
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:383
should expose the right device type to the guest
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:384
------------------------------
2018/07/30 06:21:34 read closing down: EOF
•2018/07/30 06:21:34 read closing down: EOF
2018/07/30 06:22:10 read closing down: EOF
2018/07/30 06:22:11 read closing down: EOF
------------------------------
• [SLOW TEST:37.087 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
VirtualMachineInstance with custom MAC address
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:417
should configure custom MAC address
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:418
------------------------------
2018/07/30 06:22:50 read closing down: EOF
2018/07/30 06:22:51 read closing down: EOF
• [SLOW TEST:39.501 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
VirtualMachineInstance with custom MAC address in non-conventional format
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:429
should configure custom MAC address
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:430
------------------------------
2018/07/30 06:23:25 read closing down: EOF
2018/07/30 06:23:26 read closing down: EOF
• [SLOW TEST:34.716 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
VirtualMachineInstance with custom MAC address and slirp interface
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:442
should configure custom MAC address
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:443
------------------------------
2018/07/30 06:24:05 read closing down: EOF
• [SLOW TEST:40.301 seconds]
Networking
2018/07/30 06:24:06 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
VirtualMachineInstance with disabled automatic attachment of interfaces
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:455
should not configure any external interfaces
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:456
------------------------------
•••••••••••2018/07/30 06:24:58 read closing down: EOF
Service cluster-ip-vmi successfully exposed for virtualmachineinstance testvmif5jcz
------------------------------
• [SLOW TEST:50.672 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on a VM
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
Expose ClusterIP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:68
Should expose a Cluster IP service on a VMI and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:71
------------------------------
Service cluster-ip-target-vmi successfully exposed for virtualmachineinstance testvmif5jcz
•Service node-port-vmi successfully exposed for virtualmachineinstance testvmif5jcz
------------------------------
• [SLOW TEST:10.354 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on a VM
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
Expose NodePort service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:124
Should expose a NodePort service on a VMI and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:129
------------------------------
2018/07/30 06:25:58 read closing down: EOF
Service cluster-ip-udp-vmi successfully exposed for virtualmachineinstance testvmi9hbck
• [SLOW TEST:46.931 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose UDP service on a VMI
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
Expose ClusterIP UDP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:173
Should expose a ClusterIP service on a VMI and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:177
------------------------------
Service node-port-udp-vmi successfully exposed for virtualmachineinstance testvmi9hbck
• [SLOW TEST:8.419 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose UDP service on a VMI
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
Expose NodePort UDP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:205
Should expose a NodePort service on a VMI and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:210
------------------------------
2018/07/30 06:27:00 read closing down: EOF
2018/07/30 06:27:10 read closing down: EOF
Service cluster-ip-vmirs successfully exposed for vmirs replicasethlcjw
• [SLOW TEST:66.484 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on a VMI replica set
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:253
Expose ClusterIP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:286
Should create a ClusterIP service on VMRS and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:290
------------------------------
Service cluster-ip-vm successfully exposed for virtualmachine testvmijq8w4
VM testvmijq8w4 was scheduled to start
2018/07/30 06:27:59 read closing down: EOF
• [SLOW TEST:47.720 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on an VM
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:318
Expose ClusterIP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:362
Connect to ClusterIP services that was set when VM was offline
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:363
------------------------------
• [SLOW TEST:49.744 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
with a cirros image
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:67
should return that we are running cirros
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:68
------------------------------
2018/07/30 06:28:54 read closing down: EOF
• [SLOW TEST:50.223 seconds]
2018/07/30 06:29:44 read closing down: EOF
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
with a fedora image
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:77
should return that we are running fedora
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:78
------------------------------
2018/07/30 06:30:21 read closing down: EOF
2018/07/30 06:30:22 read closing down: EOF
2018/07/30 06:30:23 read closing down: EOF
2018/07/30 06:30:24 read closing down: EOF
• [SLOW TEST:39.630 seconds]
Console
2018/07/30 06:30:24 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
should be able to reconnect to console multiple times
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:87
------------------------------
• [SLOW TEST:18.582 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
should wait until the virtual machine is in running state and return a stream interface
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:103
------------------------------
• [SLOW TEST:30.308 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
should fail waiting for the virtual machine instance to be running
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:111
------------------------------
• [SLOW TEST:30.527 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
should fail waiting for the expecter
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:134
------------------------------
2018/07/30 06:32:31 read closing down: EOF
2018/07/30 06:33:16 read closing down: EOF
• [SLOW TEST:94.826 seconds]
Slirp
2018/07/30 06:33:18 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
should be able to
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
VirtualMachineInstance with slirp interface
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
•2018/07/30 06:33:21 read closing down: EOF
2018/07/30 06:33:58 read closing down: EOF
------------------------------
• [SLOW TEST:37.018 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
with Alpine PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
should be successfully started
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
with Disk PVC
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:37.570 seconds]
Storage
2018/07/30 06:34:35 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
with Alpine PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
should be successfully started
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
with CDRom PVC
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:34:24 http: TLS handshake error from 10.244.1.1:43534: EOF
2018/07/30 10:34:34 http: TLS handshake error from 10.244.1.1:43540: EOF
2018/07/30 10:34:44 http: TLS handshake error from 10.244.1.1:43546: EOF
2018/07/30 10:34:54 http: TLS handshake error from 10.244.1.1:43552: EOF
2018/07/30 10:35:04 http: TLS handshake error from 10.244.1.1:43558: EOF
2018/07/30 10:35:14 http: TLS handshake error from 10.244.1.1:43564: EOF
2018/07/30 10:35:24 http: TLS handshake error from 10.244.1.1:43570: EOF
2018/07/30 10:35:34 http: TLS handshake error from 10.244.1.1:43576: EOF
2018/07/30 10:35:44 http: TLS handshake error from 10.244.1.1:43582: EOF
2018/07/30 10:35:54 http: TLS handshake error from 10.244.1.1:43588: EOF
2018/07/30 10:36:04 http: TLS handshake error from 10.244.1.1:43594: EOF
2018/07/30 10:36:14 http: TLS handshake error from 10.244.1.1:43600: EOF
2018/07/30 10:36:24 http: TLS handshake error from 10.244.1.1:43606: EOF
2018/07/30 10:36:34 http: TLS handshake error from 10.244.1.1:43612: EOF
2018/07/30 10:36:44 http: TLS handshake error from 10.244.1.1:43618: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T10:36:06.431407Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:36:06.434850Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:36:13 http: TLS handshake error from 10.244.0.1:59514: EOF
level=info timestamp=2018-07-30T10:36:16.030548Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:36:23 http: TLS handshake error from 10.244.0.1:59538: EOF
2018/07/30 10:36:33 http: TLS handshake error from 10.244.0.1:59562: EOF
level=info timestamp=2018-07-30T10:36:36.636521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:36:36.641656Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:36:43 http: TLS handshake error from 10.244.0.1:59586: EOF
level=info timestamp=2018-07-30T10:36:45.728205Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:36:45.746322Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:36:46.109279Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:36:53 http: TLS handshake error from 10.244.0.1:59610: EOF
level=info timestamp=2018-07-30T10:36:53.259323Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:36:53.262543Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-d46hg
Pod phase: Running
level=info timestamp=2018-07-30T10:15:33.059152Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T10:30:43.068905Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmins66g kind= uid=9cbda769-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:30:43.070316Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmins66g kind= uid=9cbda769-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:31:13.586795Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5nlhj kind= uid=aeeba4e8-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:31:13.587423Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5nlhj kind= uid=aeeba4e8-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:31:43.645258Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilxvxd kind= uid=c0d429f8-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:31:43.645657Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilxvxd kind= uid=c0d429f8-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:32:31.216158Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdtmh kind= uid=dd2c838a-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:32:31.220603Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdtmh kind= uid=dd2c838a-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:22.125320Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihcxzr kind= uid=fb8a6919-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:33:22.127374Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihcxzr kind= uid=fb8a6919-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:58.222022Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw9q5b kind= uid=110cd424-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:33:58.225023Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw9q5b kind= uid=110cd424-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:58.412292Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiw9q5b\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiw9q5b"
level=info timestamp=2018-07-30T10:34:35.886505Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:34:35.890223Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-9gqdg
Pod phase: Running
level=info timestamp=2018-07-30T10:23:07.692990Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.190924Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.192459Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.200260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi6hvxr"
level=info timestamp=2018-07-30T10:24:06.292349Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.423274Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.424121Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.425957Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiz2cnn"
level=info timestamp=2018-07-30T10:24:06.521176Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.521269Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.522188Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915250Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915410Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.915451Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.927096Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-t48x9
Pod phase: Running
level=info timestamp=2018-07-30T10:34:54.333146Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:34:54.333467Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:34:54.336623Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:34:54.391556Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:34:54.391966Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-30T10:34:54.413423Z pos=vm.go:404 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4986j\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-30T10:34:54.413762Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4986j\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4986j"
level=info timestamp=2018-07-30T10:34:54.414037Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:34:54.436465Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:34:54.436933Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:34:54.447406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:34:54.460440Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:34:54.460681Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:34:54.465892Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi4986j"
level=info timestamp=2018-07-30T10:34:54.622301Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

• Failure [139.340 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
with Alpine PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
should be successfully started and stopped multiple times
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
with Disk PVC [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

Timed out after 120.000s.
Expected
  : false
to be true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1065
------------------------------
STEP: Starting and stopping the VirtualMachineInstance number of times
STEP: Starting a VirtualMachineInstance
STEP: Waiting until the VirtualMachineInstance will start
level=info timestamp=2018-07-30T10:34:36.454822Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4986j kind=VirtualMachineInstance uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4986j-7fdsw"
level=info timestamp=2018-07-30T10:34:53.034209Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4986j kind=VirtualMachineInstance uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi4986j-7fdsw"
level=info timestamp=2018-07-30T10:34:54.798463Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4986j kind=VirtualMachineInstance uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:34:54.821677Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4986j kind=VirtualMachineInstance uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:36:44 http: TLS handshake error from 10.244.1.1:43618: EOF
2018/07/30 10:36:54 http: TLS handshake error from 10.244.1.1:43624: EOF
2018/07/30 10:37:04 http: TLS handshake error from 10.244.1.1:43630: EOF
2018/07/30 10:37:14 http: TLS handshake error from 10.244.1.1:43636: EOF
2018/07/30 10:37:24 http: TLS handshake error from 10.244.1.1:43642: EOF
2018/07/30 10:37:34 http: TLS handshake error from 10.244.1.1:43648: EOF
2018/07/30 10:37:44 http: TLS handshake error from 10.244.1.1:43654: EOF
2018/07/30 10:37:54 http: TLS handshake error from 10.244.1.1:43660: EOF
2018/07/30 10:38:04 http: TLS handshake error from 10.244.1.1:43666: EOF
2018/07/30 10:38:14 http: TLS handshake error from 10.244.1.1:43672: EOF
2018/07/30 10:38:24 http: TLS handshake error from 10.244.1.1:43678: EOF
2018/07/30 10:38:34 http: TLS handshake error from 10.244.1.1:43684: EOF
2018/07/30 10:38:44 http: TLS handshake error from 10.244.1.1:43690: EOF
2018/07/30 10:38:54 http: TLS handshake error from 10.244.1.1:43696: EOF
2018/07/30 10:39:04 http: TLS handshake error from 10.244.1.1:43702: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
level=info timestamp=2018-07-30T10:38:07.171635Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:38:07.173567Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:38:13 http: TLS handshake error from 10.244.0.1:59802: EOF
level=info timestamp=2018-07-30T10:38:16.047062Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:38:23 http: TLS handshake error from 10.244.0.1:59826: EOF
2018/07/30 10:38:33 http: TLS handshake error from 10.244.0.1:59850: EOF
level=info timestamp=2018-07-30T10:38:37.377496Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:38:37.393293Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:38:43 http: TLS handshake error from 10.244.0.1:59874: EOF
level=info timestamp=2018-07-30T10:38:46.027485Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:38:53 http: TLS handshake error from 10.244.0.1:59898: EOF
2018/07/30 10:39:03 http: TLS handshake error from 10.244.0.1:59922: EOF
level=info timestamp=2018-07-30T10:39:07.595520Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:39:07.595519Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:39:13 http: TLS handshake error from 10.244.0.1:59946: EOF
Pod name: virt-controller-7d57d96b65-d46hg
Pod phase: Running
level=info timestamp=2018-07-30T10:15:33.059152Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T10:31:13.586795Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5nlhj kind= uid=aeeba4e8-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:31:13.587423Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5nlhj kind= uid=aeeba4e8-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:31:43.645258Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilxvxd kind= uid=c0d429f8-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:31:43.645657Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilxvxd kind= uid=c0d429f8-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:32:31.216158Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdtmh kind= uid=dd2c838a-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:32:31.220603Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdtmh kind= uid=dd2c838a-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:22.125320Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihcxzr kind= uid=fb8a6919-93e3-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:33:22.127374Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihcxzr kind= uid=fb8a6919-93e3-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:58.222022Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw9q5b kind= uid=110cd424-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:33:58.225023Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw9q5b kind= uid=110cd424-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:33:58.412292Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiw9q5b\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiw9q5b"
level=info timestamp=2018-07-30T10:34:35.886505Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:34:35.890223Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4986j kind= uid=27820a8e-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:36:55.363845Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:36:55.368859Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-9gqdg
Pod phase: Running
level=info timestamp=2018-07-30T10:23:07.692990Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.190924Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.192459Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.200260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi6hvxr"
level=info timestamp=2018-07-30T10:24:06.292349Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.423274Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.424121Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.425957Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiz2cnn"
level=info timestamp=2018-07-30T10:24:06.521176Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.521269Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.522188Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915250Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915410Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.915451Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.927096Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-t48x9
Pod phase: Running
level=info timestamp=2018-07-30T10:37:11.095235Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:37:12.250391Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:37:12.252043Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind=Domain uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:37:12.987175Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:37:12.987493Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind=Domain uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:37:13.026039Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:37:13.029813Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:37:13.029969Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:37:13.120441Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:37:13.121428Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:37:13.125186Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:37:13.188168Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:37:13.188298Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:37:13.191297Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmifdx27"
level=info timestamp=2018-07-30T10:37:13.243848Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."

• Failure [138.841 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
with Alpine PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
should be successfully started and stopped multiple times
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
with CDRom PVC [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

Timed out after 120.000s.
Expected
  : false
to be true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1065
------------------------------
STEP: Starting and stopping the VirtualMachineInstance number of times
STEP: Starting a VirtualMachineInstance
STEP: Waiting until the VirtualMachineInstance will start
level=info timestamp=2018-07-30T10:36:56.036950Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmifdx27 kind=VirtualMachineInstance uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmifdx27-c5zqp"
level=info timestamp=2018-07-30T10:37:11.498360Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmifdx27 kind=VirtualMachineInstance uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmifdx27-c5zqp"
level=info timestamp=2018-07-30T10:37:13.538137Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmifdx27 kind=VirtualMachineInstance uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:37:13.552978Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmifdx27 kind=VirtualMachineInstance uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
2018/07/30 06:39:59 read closing down: EOF
• [SLOW TEST:45.494 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
With an emptyDisk defined
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:113
should create a writeable emptyDisk with the right capacity
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:115
------------------------------
• [SLOW TEST:49.887 seconds]
2018/07/30 06:40:49 read closing down: EOF
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
With an emptyDisk defined and a specified serial number
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:163
should create a writeable emptyDisk with the specified serial number
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:165
------------------------------
• [SLOW TEST:35.683 seconds]
2018/07/30 06:41:25 read closing down: EOF
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
With ephemeral alpine PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
should be successfully started
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:207
------------------------------
Pod name: disks-images-provider-9tn5b
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-tqbmm
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-frt4f
Pod phase: Running
2018/07/30 10:41:34 http: TLS handshake error from 10.244.1.1:43794: EOF
2018/07/30 10:41:44 http: TLS handshake error from 10.244.1.1:43800: EOF
2018/07/30 10:41:54 http: TLS handshake error from 10.244.1.1:43806: EOF
2018/07/30 10:42:04 http: TLS handshake error from 10.244.1.1:43812: EOF
2018/07/30 10:42:14 http: TLS handshake error from 10.244.1.1:43818: EOF
2018/07/30 10:42:24 http: TLS handshake error from 10.244.1.1:43824: EOF
2018/07/30 10:42:34 http: TLS handshake error from 10.244.1.1:43830: EOF
2018/07/30 10:42:44 http: TLS handshake error from 10.244.1.1:43836: EOF
2018/07/30 10:42:54 http: TLS handshake error from 10.244.1.1:43842: EOF
2018/07/30 10:43:04 http: TLS handshake error from 10.244.1.1:43848: EOF
2018/07/30 10:43:14 http: TLS handshake error from 10.244.1.1:43854: EOF
2018/07/30 10:43:24 http: TLS handshake error from 10.244.1.1:43860: EOF
2018/07/30 10:43:34 http: TLS handshake error from 10.244.1.1:43866: EOF
2018/07/30 10:43:44 http: TLS handshake error from 10.244.1.1:43872: EOF
2018/07/30 10:43:54 http: TLS handshake error from 10.244.1.1:43878: EOF
Pod name: virt-api-7d79764579-hmhf7
Pod phase: Running
2018/07/30 10:42:53 http: TLS handshake error from 10.244.0.1:60504: EOF
level=info timestamp=2018-07-30T10:42:53.654321Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-30T10:42:53.657194Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:43:03 http: TLS handshake error from 10.244.0.1:1024: EOF
level=info timestamp=2018-07-30T10:43:09.072162Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:43:09.073134Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:43:13 http: TLS handshake error from 10.244.0.1:60552: EOF
level=info timestamp=2018-07-30T10:43:16.017021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:43:23 http: TLS handshake error from 10.244.0.1:60576: EOF
2018/07/30 10:43:33 http: TLS handshake error from 10.244.0.1:60600: EOF
level=info timestamp=2018-07-30T10:43:39.239346Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-30T10:43:39.252150Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/30 10:43:43 http: TLS handshake error from 10.244.0.1:1025: EOF
level=info timestamp=2018-07-30T10:43:45.995655Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/30 10:43:53 http: TLS handshake error from 10.244.0.1:1026: EOF
Pod name: virt-controller-7d57d96b65-d46hg
Pod phase: Running
level=info timestamp=2018-07-30T10:15:33.059152Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-d9qsl
Pod phase: Running
level=info timestamp=2018-07-30T10:36:55.363845Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:36:55.368859Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdx27 kind= uid=7aa3bdfe-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:39:13.876707Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7nsk kind= uid=cd335c8f-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:39:13.879938Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7nsk kind= uid=cd335c8f-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:39:14.035093Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7nsk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7nsk"
level=info timestamp=2018-07-30T10:39:59.498992Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihp946 kind= uid=e8642b49-93e4-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:39:59.500428Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihp946 kind= uid=e8642b49-93e4-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:40:49.376049Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidktv5 kind= uid=061f059b-93e5-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:40:49.377568Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidktv5 kind= uid=061f059b-93e5-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:40:49.551691Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidktv5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidktv5"
level=info timestamp=2018-07-30T10:40:49.569584Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidktv5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidktv5"
level=info timestamp=2018-07-30T10:41:24.959858Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-30T10:41:24.963553Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-30T10:41:25.112813Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibsp2s\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibsp2s"
level=info timestamp=2018-07-30T10:41:25.133094Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibsp2s\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibsp2s"
Pod name: virt-handler-9gqdg
Pod phase: Running
level=info timestamp=2018-07-30T10:23:07.692990Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.190924Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.192459Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.200260Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmi6hvxr"
level=info timestamp=2018-07-30T10:24:06.292349Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind= uid=835f2d3f-93e2-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.423274Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:24:06.424121Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.425957Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiz2cnn"
level=info timestamp=2018-07-30T10:24:06.521176Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.521269Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.522188Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6hvxr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915250Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind= uid=ef2f0528-93e1-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:24:06.915410Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-30T10:24:06.915451Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-30T10:24:06.927096Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiz2cnn kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-t48x9
Pod phase: Running
level=info timestamp=2018-07-30T10:41:41.721471Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:41:43.082519Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-30T10:41:43.082800Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind=Domain uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-30T10:41:43.784868Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:41:43.785102Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind=Domain uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-30T10:41:43.881640Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-30T10:41:43.883963Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:41:43.884073Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-30T10:41:43.928587Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:41:43.937001Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-30T10:41:43.956566Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-30T10:42:01.726525Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-30T10:42:01.727001Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-30T10:42:01.730255Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmibsp2s"
level=info timestamp=2018-07-30T10:42:01.802884Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibsp2s kind= uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Synchronization loop succeeded."
2018/07/30 06:44:02 read closing down: EOF

• Failure [157.411 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With ephemeral alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
      should not persist data [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:218

      Timed out after 120.000s.
      Expected
        : false
      to be true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1065
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: Starting a VirtualMachineInstance
STEP: Waiting until the VirtualMachineInstance will start
level=info timestamp=2018-07-30T10:41:25.553086Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibsp2s kind=VirtualMachineInstance uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmibsp2s-q9mcx"
level=info timestamp=2018-07-30T10:41:42.115644Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibsp2s kind=VirtualMachineInstance uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmibsp2s-q9mcx"
level=info timestamp=2018-07-30T10:41:44.309960Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibsp2s kind=VirtualMachineInstance uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-30T10:41:44.340407Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibsp2s kind=VirtualMachineInstance uid=1b54075c-93e5-11e8-97bf-525500d15501 msg="VirtualMachineInstance started."
STEP: Writing an arbitrary file to it's EFI partition
STEP: Killing a VirtualMachineInstance

panic: test timed out after 1h30m0s

goroutine 8173 [running]:
testing.(*M).startAlarm.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc
created by time.goFunc
	/gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44

goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(0xc420825860, 0x139eaef, 0x9, 0x1430f68, 0x4801e6)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301
testing.runTests.func1(0xc420825770)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64
testing.tRunner(0xc420825770, 0xc420525df8)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
testing.runTests(0xc4207425e0, 0x1d32a50, 0x1, 0x1, 0x412009)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4
testing.(*M).Run(0xc42056e180, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171
main.main()
	_testmain.go:44 +0x151

goroutine 20 [chan receive]:
kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1d5e280)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b
created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203

goroutine 21 [syscall, 90 minutes]:
os/signal.signal_recv(0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6
os/signal.loop()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.0
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41

goroutine 11 [select]:
kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).match(0xc420b3f980, 0x14c4ce0, 0x1d7c938, 0x412801, 0x0, 0x0, 0x0, 0x1d7c938)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:139 +0x2e6
kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).Should(0xc420b3f980, 0x14c4ce0, 0x1d7c938, 0x0, 0x0, 0x0, 0xc420b3f980)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:48 +0x62
kubevirt.io/kubevirt/tests.WaitForVirtualMachineToDisappearWithTimeout(0xc420d08280, 0x78)
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1065 +0x228
kubevirt.io/kubevirt/tests_test.glob..func7.3.5.3()
	/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:305 +0x365
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc420023740, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc420023740, 0x3, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*ItNode).Run(0xc4207436e0, 0x14b6f60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/it_node.go:26 +0x7f
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc4204305a0, 0x0, 0x14b6f60, 0xc4200fb480)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:203 +0x648
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc4204305a0, 0x14b6f60, 0xc4200fb480)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc4203792c0, 0xc4204305a0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc4203792c0, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc4203792c0, 0xb)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200f8af0, 0x7f18cb978ac8, 0xc420825860, 0x13a10d2, 0xb, 0xc420742820, 0x2, 0x2, 0x14d3820, 0xc4200fb480, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x14b7fc0, 0xc420825860, 0x13a10d2, 0xb, 0xc420742740, 0x2, 0x2, 0x2)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x14b7fc0, 0xc420825860, 0x13a10d2, 0xb, 0xc4204db270, 0x1, 0x1, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab
kubevirt.io/kubevirt/tests_test.TestTests(0xc420825860)
	/root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa
testing.tRunner(0xc420825860, 0x1430f68)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
created by testing.(*T).Run
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0

goroutine 12 [chan receive, 90 minutes]:
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc4203792c0, 0xc4209c59e0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1
created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88

goroutine 13 [select, 90 minutes, locked to thread]:
runtime.gopark(0x1433140, 0x0, 0x139b611, 0x6, 0x18, 0x1)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a
runtime.selectgo(0xc42047b750, 0xc4209c5aa0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50
runtime.ensureSigM.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4
runtime.goexit()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1

goroutine 16 [IO wait]:
internal/poll.runtime_pollWait(0x7f18cba04f00, 0x72, 0xc42109f850)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc420696318, 0x72, 0xffffffffffffff00, 0x14b9180, 0x1c497d0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b
internal/poll.(*pollDesc).waitRead(0xc420696318, 0xc420e3a000, 0x8000, 0x8000)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc420696300, 0xc420e3a000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d
net.(*netFD).Read(0xc420696300, 0xc420e3a000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc4200f67e0, 0xc420e3a000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a
crypto/tls.(*block).readFromUntil(0xc4205416b0, 0x7f18cb978be0, 0xc4200f67e0, 0x5, 0xc4200f67e0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96
crypto/tls.(*Conn).readRecord(0xc420413180, 0x1433217, 0xc4204132a0, 0x20)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0
crypto/tls.(*Conn).Read(0xc420413180, 0xc4205fe000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100
bufio.(*Reader).Read(0xc42015d1a0, 0xc4205329d8, 0x9, 0x9, 0xc420f3fc78, 0xc4205047a0, 0xc42109fd10)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238
io.ReadAtLeast(0x14b5d60, 0xc42015d1a0, 0xc4205329d8, 0x9, 0x9, 0x9, 0xc42109fce0, 0xc42109fce0, 0x406614)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86
io.ReadFull(0x14b5d60, 0xc42015d1a0, 0xc4205329d8, 0x9, 0x9, 0xc420f3fc20, 0xc42109fd10, 0xc400002e01)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc4205329d8, 0x9, 0x9, 0x14b5d60, 0xc42015d1a0, 0x0, 0xc400000000, 0x7efa2d, 0xc42109ffb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc4205329a0, 0xc4207e9500, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc42109ffb0, 0x1431ec0, 0xc42047bfb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc4202acea0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76
created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651

goroutine 4694 [chan send, 35 minutes]:
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4207f77d0)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114
created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8

goroutine 7171 [chan receive, 14 minutes]:
kubevirt.io/kubevirt/pkg/kubecli.(*asyncWSRoundTripper).WebsocketCallback(0xc420be0110, 0xc420d83cc0, 0xc42016b170, 0x0, 0x0, 0x18, 0xc421079ec8)
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:163 +0x32b
kubevirt.io/kubevirt/pkg/kubecli.(*asyncWSRoundTripper).WebsocketCallback-fm(0xc420d83cc0, 0xc42016b170, 0x0, 0x0, 0xc420d83cc0, 0xc42016b170)
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:313 +0x52
kubevirt.io/kubevirt/pkg/kubecli.(*WebsocketRoundTripper).RoundTrip(0xc420be05d0, 0xc420d8ef00, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:142 +0xab
kubevirt.io/kubevirt/pkg/kubecli.(*vmis).asyncSubresourceHelper.func1(0x14b6240, 0xc420be05d0, 0xc420d8ef00, 0xc42064fb00)
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:328 +0x56
created by kubevirt.io/kubevirt/pkg/kubecli.(*vmis).asyncSubresourceHelper
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:326 +0x33a

goroutine 5940 [chan send, 26 minutes]:
kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1.1(0x14f1520, 0xc4209d1140, 0xc4200f60a8, 0xc4209c4c00, 0xc4204f4b78, 0xc4204f4b88)
	/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:81 +0x138
created by kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1
	/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:73 +0x386
make: *** [functest] Error 2
+ make cluster-down
./cluster/down.sh