+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release
+ [[ k8s-1.10.3-release =~ openshift-.* ]]
+ [[ k8s-1.10.3-release =~ .*-1.9.3-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.3
+ KUBEVIRT_PROVIDER=k8s-1.10.3
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/07/23 17:57:50 Waiting for host: 192.168.66.101:22
2018/07/23 17:57:53 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/07/23 17:58:05 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.3
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 30.008832 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:b6960d6284cdcfed0fbebd0c592e262d23b25365f66e99c3a92f3ccf43f3f73b

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/07/23 17:58:49 Waiting for host: 192.168.66.102:22
2018/07/23 17:58:52 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/23 17:59:00 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/23 17:59:05 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 39588992 kubectl
Sending file modes: C0600 5450 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01    Ready     master    47s       v1.10.3
node02    Ready     <none>    18s       v1.10.3
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    49s       v1.10.3
node02    Ready     <none>    20s       v1.10.3
+ make cluster-sync
./cluster/build.sh
Building ...
Untagged: localhost:34620/kubevirt/virt-controller:devel
Untagged: localhost:34620/kubevirt/virt-controller@sha256:fb779157baef260e1153e5dfab7adaaa291b1c6e8bdbf9cb10b8eac98e20bad1
Deleted: sha256:18aff960359e020764ed0966f6c5a53186f2112845005937246819ba3f4cf721
Deleted: sha256:70d6cf6ac31f44a694e722892e0ab6a121ef72088ce53247340f97e76f9d8f74
Deleted: sha256:d582ae63719b07a9769bf6f5f770e0203d72bda9fbad9d4c8230030f631613a5
Deleted: sha256:d98325ca91996ced9def62ff87d1ec4bc1d3315de0b7832da05c3526da318615
Untagged: localhost:34620/kubevirt/virt-launcher:devel
Untagged: localhost:34620/kubevirt/virt-launcher@sha256:f990cbfaebcde77a08de57cd0addebecdd31962e0dcc403ed267a60854607aa4
Deleted: sha256:5c0235943cc2dac829675e27bfe109643190d8ab7091f3845caf8413eea8c0a6
Deleted: sha256:b68c74377e0f3ba0cbf8680b5300a3576f62ad00523e729ec431903ca607d3d9
Deleted: sha256:60ffd5b0068586ca82c454cc7804e98f52e7279a9717d5fc2eca3ebcaa4a316c
Deleted: sha256:a60e05839ca3c307f5c5ccc4913cbfbc760e250ba90e2bf32909fab2b2ee650a
Deleted: sha256:45d220ce7ece34ea7b34934e7fe2d35ed626b545e36b8a38fe1ae9909d848f68
Deleted: sha256:6d52be8dc46221642bf4bae86096d088cc6591dba2af3198cd93ac9452299688
Deleted: sha256:104d141b02b8e4cb064a6e22d0e41c3e4d4d9ba9cf70cf859704f91b2abb4777
Deleted: sha256:32d08d68a3423a4aa931774fdceb216d6b14fd0f05fed20dd29f425376e6d0aa
Deleted: sha256:2013d264ee2cb03bd7fb51515e5726edbf2c8c572f0d00b04724e53e39fb2f2e
Deleted: sha256:bcad762e8a779ca6dbf7b5218dfde062a660e336b4268e258782cb83bedf1b40
Deleted: sha256:8b4fbf665b0a0cd1bd23602cf2f90d0b6a66f1c3ed18e55da4fadc48b8ff8494
Deleted: sha256:d12deae53de2fd904d99ddaa13145943e1d933adc05e720b965caf0bec44c313
Untagged: localhost:34620/kubevirt/virt-handler:devel
Untagged: localhost:34620/kubevirt/virt-handler@sha256:48baa6552b55090e3243b266a9a6f2dc70acbc0ca9937cd9150c149435a7c310
Deleted: sha256:aab1f63492032ec234838895e64a7f682c2cd589aa67ec463aae5f5115857a92
Deleted: sha256:a05c14b03926466ac1ff4992dd7c4dcc92930a2a02c49b8f355c197a310db0b4
Deleted: sha256:1721ab778241b4492182720c31e47ad8a845068d485d718de1121fc9496b81bc
Deleted: sha256:a63b8ac6bf5850253b35ca56117a5bbd897090073385e8b74bb1b51cd5285e75
Untagged: localhost:34620/kubevirt/virt-api:devel
Untagged: localhost:34620/kubevirt/virt-api@sha256:ea4f091fa829105040c5e441689a3802d34b7e3ae2032ababf06491d8082e1e3
Deleted: sha256:e45d8956f92ef4ec1eb6e72c31fe259d86f92338ed2fd03c5c554b2e0e748d8b
Deleted: sha256:4bd15ba95467615d6784f43e5c658800c7f7b367a17f0d0fe4ee8b8e89f6c524
Deleted: sha256:9ad49415df4d0fb49e56bc53e5bc60e1262164578fb2378183ba60bfe011a3b9
Deleted: sha256:5042b4cbcd61f7d6a490ad3c7656d524e944c5148df01f3de9a8a315631c3f8e
Untagged: localhost:34620/kubevirt/subresource-access-test:devel
Untagged: localhost:34620/kubevirt/subresource-access-test@sha256:eb59f8932170291246229efefcb05025068e1235cf32439e71219d2d8f05481e
Deleted: sha256:e0d4fe2855778247d9ce2726f292b09dfd3c02efa72889197328f56b6f6cbcad
Deleted: sha256:2c3e18361a6d9611c747ff7f91b9bcbfb5e0219e8e6fb1dc240046e2c8e88ea6
Deleted: sha256:7df9591700c2d19a83b070e049e6201414e0586abad0004d04d1cdb74cba114e
Deleted: sha256:874d25bb8ba9f37871dcbbb94ef7edbf1452366314229af1bf0e3b0df030c390
Untagged: localhost:34620/kubevirt/example-hook-sidecar:devel
Untagged: localhost:34620/kubevirt/example-hook-sidecar@sha256:da514b8e9f732e430182c12ebb2c56e862731f0dfc9ebba218d111583d8c0ffc
Deleted: sha256:d95e2aa20f50874475fddf31e4533739b024d06509def543be4b0313a4f7067e
Deleted: sha256:f18376841759947a1dbc5e162e24e50ffde35989b2e2ea6bcfa97f5ca5fb06b1
Deleted: sha256:daa2b0297edffceee4b5f28774bcea7d2d74081c972acba2fef832ded0364c26
Deleted: sha256:4a1c3845e9f76027210c059f905a7bb1f40607cf04149c487b1ba4d3a29ab89b
sha256:7fb8539d32771bf74786d31102b8c102fc61586b172276b4710c6944077751f4
go version go1.10 linux/amd64
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:7fb8539d32771bf74786d31102b8c102fc61586b172276b4710c6944077751f4
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
Compiling tests...
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.35 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
 ---> Using cache
 ---> e9589b9dbfb3
Step 4/8 : WORKDIR /home/virt-controller
 ---> Using cache
 ---> 6526953b7273
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0da81e671cc6
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
 ---> Using cache
 ---> dc1dce2a22bb
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
 ---> Using cache
 ---> 64bf389ede85
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "virt-controller" ''
 ---> Running in 5fc549edd66e
 ---> 130f69141e77
Removing intermediate container 5fc549edd66e
Successfully built 130f69141e77
Sending build context to Docker daemon 42.63 MB
Step 1/10 : FROM kubevirt/libvirt:4.2.0
 ---> 5f0bfe81a3e0
Step 2/10 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 8826ac178c51
Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
 ---> Using cache
 ---> 5eb474bfa821
Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher
 ---> Using cache
 ---> 3eef8da590bb
Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt
 ---> Using cache
 ---> db7d9f778df7
Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
 ---> Using cache
 ---> c3622d9043ef
Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
 ---> Running in 655e6d0f9688
 ---> f8f482e37246
Removing intermediate container 655e6d0f9688
Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/
 ---> 1077d278659e
Removing intermediate container 27547cf8f40e
Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh
 ---> Running in a00e819c89db
 ---> 78ac050eb84b
Removing intermediate container a00e819c89db
Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "virt-launcher" ''
 ---> Running in f59e56c100c0
 ---> 38a4c6e286c4
Removing intermediate container f59e56c100c0
Successfully built 38a4c6e286c4
Sending build context to Docker daemon 41.66 MB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/5 : COPY virt-handler /usr/bin/virt-handler
 ---> c0232831d5fa
Removing intermediate container 0f9443c7c87b
Step 4/5 : ENTRYPOINT /usr/bin/virt-handler
 ---> Running in 4cc24dc1e7f7
 ---> 95c8b7f98274
Removing intermediate container 4cc24dc1e7f7
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "virt-handler" ''
 ---> Running in 1dec300cd58f
 ---> 2e1f0ff02201
Removing intermediate container 1dec300cd58f
Successfully built 2e1f0ff02201
Sending build context to Docker daemon 38.75 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api
 ---> Using cache
 ---> 1a58ff1483fa
Step 4/8 : WORKDIR /home/virt-api
 ---> Using cache
 ---> 87e30c5b4065
Step 5/8 : USER 1001
 ---> Using cache
 ---> e889af541bd0
Step 6/8 : COPY virt-api /usr/bin/virt-api
 ---> Using cache
 ---> d648b895342d
Step 7/8 : ENTRYPOINT /usr/bin/virt-api
 ---> Running in 7b345eda3602
 ---> c6669a596374
Removing intermediate container 7b345eda3602
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "virt-api" ''
 ---> Running in fea6dbbf9a98
 ---> 1fb981d29b06
Removing intermediate container fea6dbbf9a98
Successfully built 1fb981d29b06
Sending build context to Docker daemon 4.096 kB
Step 1/7 : FROM fedora:28
 ---> cc510acfcd70
Step 2/7 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/7 : ENV container docker
 ---> Using cache
 ---> 6e6b2ef85e92
Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img
 ---> Using cache
 ---> 8e1d737ded1f
Step 5/7 : ADD entrypoint.sh /
 ---> Using cache
 ---> 104e48aa676f
Step 6/7 : CMD /entrypoint.sh
 ---> Using cache
 ---> 4ed9f69e6653
Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.10.3-release0" ''
 ---> Using cache
 ---> 694cf1afe619
Successfully built 694cf1afe619
Sending build context to Docker daemon 2.56 kB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/5 : ENV container docker
 ---> Using cache
 ---> 6e6b2ef85e92
Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all
 ---> Using cache
 ---> d130857891a9
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "vm-killer" ''
 ---> Using cache
 ---> 0b1469b868f8
Successfully built 0b1469b868f8
Sending build context to Docker daemon 5.12 kB
Step 1/7 : FROM debian:sid
 ---> 496290160351
Step 2/7 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 3b36b527fef8
Step 3/7 : ENV container docker
 ---> Using cache
 ---> b3ada414d649
Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 337be6171fcb
Step 5/7 : ADD entry-point.sh /
 ---> Using cache
 ---> a98a961fa5a1
Step 6/7 : CMD /entry-point.sh
 ---> Using cache
 ---> 19baf5d1aab8
Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "registry-disk-v1alpha" ''
 ---> Using cache
 ---> caee040db85c
Successfully built caee040db85c
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:35003/kubevirt/registry-disk-v1alpha:devel
 ---> caee040db85c
Step 2/4 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 453ad127b9bc
Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img
 ---> Using cache
 ---> 28cedfe7d642
Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.3-release0" ''
 ---> Using cache
 ---> b0932ddb9d63
Successfully built b0932ddb9d63
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:35003/kubevirt/registry-disk-v1alpha:devel
 ---> caee040db85c
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> b591880b7a09
Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2
 ---> Using cache
 ---> 9848462e6b89
Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.3-release0" ''
 ---> Using cache
 ---> 901ffca67a01
Successfully built 901ffca67a01
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:35003/kubevirt/registry-disk-v1alpha:devel
 ---> caee040db85c
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> b591880b7a09
Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
 ---> Using cache
 ---> 4089bb58f7c7
Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.3-release0" ''
 ---> Using cache
 ---> 13c6fa3b7f0d
Successfully built 13c6fa3b7f0d
Sending build context to Docker daemon 35.56 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl
 ---> Using cache
 ---> f9cd90a6a0ef
Step 4/8 : WORKDIR /home/virtctl
 ---> Using cache
 ---> df6f2d83c1d6
Step 5/8 : USER 1001
 ---> Using cache
 ---> 56a7b7e6b8ff
Step 6/8 : COPY subresource-access-test /subresource-access-test
 ---> dd68f917591a
Removing intermediate container a3055e8b241a
Step 7/8 : ENTRYPOINT /subresource-access-test
 ---> Running in b5eee8ef9441
 ---> d9193851c9b0
Removing intermediate container b5eee8ef9441
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "subresource-access-test" ''
 ---> Running in 04821c2cb33b
 ---> 7b7653f8dcb0
Removing intermediate container 04821c2cb33b
Successfully built 7b7653f8dcb0
Sending build context to Docker daemon 3.072 kB
Step 1/9 : FROM fedora:28
 ---> cc510acfcd70
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 82fe13c41cb7
Step 3/9 : ENV container docker
 ---> Using cache
 ---> 6e6b2ef85e92
Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all
 ---> Using cache
 ---> c1e9e769c4ba
Step 5/9 : ENV GIMME_GO_VERSION 1.9.2
 ---> Using cache
 ---> 6729c465203a
Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh
 ---> Using cache
 ---> 2aee087083e8
Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin"
 ---> Using cache
 ---> e3795172dd73
Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli
 ---> Using cache
 ---> 0de2fc4b917f
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.10.3-release0" '' "winrmcli" ''
 ---> Using cache
 ---> e7206f6d248e
Successfully built e7206f6d248e
Sending build context to Docker daemon 36.77 MB
Step 1/5 : FROM fedora:27
 ---> 9110ae7f579f
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> b730b4ed65df
Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar
 ---> 1737876f8594
Removing intermediate container fdcf8c5fa259
Step 4/5 : ENTRYPOINT /example-hook-sidecar
 ---> Running in 3fd7d8cd9d06
 ---> f16ed7f144e8
Removing intermediate container 3fd7d8cd9d06
Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.10.3-release0" ''
 ---> Running in 04d04404159c
 ---> 618cbd1256d9
Removing intermediate container 04d04404159c
Successfully built 618cbd1256d9
hack/build-docker.sh push
The push refers to a repository [localhost:35003/kubevirt/virt-controller]
312e559ae1eb: Preparing
ff9b9e61b9df: Preparing
891e1e4ef82a: Preparing
ff9b9e61b9df: Pushed
312e559ae1eb: Pushed
891e1e4ef82a: Pushed
devel: digest: sha256:a3ebf63380b3a534ea3660c8b8f721aeaeb2ffc39671eee4acf3bc24bf5801da size: 949
The push refers to a repository [localhost:35003/kubevirt/virt-launcher]
6320ab00ea1a: Preparing
1590f682429d: Preparing
54a84e880d70: Preparing
a78087daffd6: Preparing
b8bd090ce3c4: Preparing
cfcba35fba84: Preparing
da38cf808aa5: Preparing
b83399358a92: Preparing
186d8b3e4fd8: Preparing
fa6154170bf5: Preparing
5eefb9960a36: Preparing
891e1e4ef82a: Preparing
186d8b3e4fd8: Waiting
fa6154170bf5: Waiting
5eefb9960a36: Waiting
da38cf808aa5: Waiting
891e1e4ef82a: Waiting
b83399358a92: Waiting
a78087daffd6: Pushed
1590f682429d: Pushed
6320ab00ea1a: Pushed
b83399358a92: Pushed
da38cf808aa5: Pushed
fa6154170bf5: Pushed
186d8b3e4fd8: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-controller
54a84e880d70: Pushed
cfcba35fba84: Pushed
b8bd090ce3c4: Pushed
5eefb9960a36: Pushed
devel: digest: sha256:c04df7beff5434575a572ddfb887a8d45d8acac8a296fbf55eec2b74a5e96fe2 size: 2828
The push refers to a repository [localhost:35003/kubevirt/virt-handler]
17ccc612212a: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-launcher
17ccc612212a: Pushed
devel: digest: sha256:0e4f7b0607ea99dea54621e99f6ef66be8c8a20083995091593ee73506b54d9b size: 741
The push refers to a repository [localhost:35003/kubevirt/virt-api]
619cbbc5bbf6: Preparing
5f1414e2d326: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-handler
5f1414e2d326: Pushed
619cbbc5bbf6: Pushed
devel: digest: sha256:40624dc90e5cbc9d1b54578bac6eca23857d5d4d3e96dc27dc471293f5b76051 size: 948
The push refers to a repository [localhost:35003/kubevirt/disks-images-provider]
2e0da09ca39e: Preparing
4fe8becbb60f: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-api
2e0da09ca39e: Pushed
4fe8becbb60f: Pushed
devel: digest: sha256:e0b18a1418fcb677c9d502ac2e553797a3700a6e6bff9e412f5f919faafc8570 size: 948
The push refers to a repository [localhost:35003/kubevirt/vm-killer]
7b031fa3032f: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/disks-images-provider
7b031fa3032f: Pushed
devel: digest: sha256:d4f193309ca1f14b26eb7590e6627c2619a1cbbcde883d730eca499cd8a5e974 size: 740
The push refers to a repository [localhost:35003/kubevirt/registry-disk-v1alpha]
bfd12fa374fa: Preparing
18ac8ad2aee9: Preparing
132d61a890c5: Preparing
bfd12fa374fa: Pushed
18ac8ad2aee9: Pushed
132d61a890c5: Pushed
devel: digest: sha256:e05fe4ee65fe45f48a25a6a776be3dfbc494db81e799404ddc948cbc4622508f size: 948
The push refers to a repository [localhost:35003/kubevirt/cirros-registry-disk-demo]
0ee9a1ddc5f8: Preparing
bfd12fa374fa: Preparing
18ac8ad2aee9: Preparing
132d61a890c5: Preparing
132d61a890c5: Mounted from kubevirt/registry-disk-v1alpha
18ac8ad2aee9: Mounted from kubevirt/registry-disk-v1alpha
bfd12fa374fa: Mounted from kubevirt/registry-disk-v1alpha
0ee9a1ddc5f8: Pushed
devel: digest: sha256:878520df112ee515b2aa937a9c9f761a18a56bfa473dac1bbcb1b0f67a7ed9d2 size: 1160
The push refers to a repository [localhost:35003/kubevirt/fedora-cloud-registry-disk-demo]
e52196ed8281: Preparing
bfd12fa374fa: Preparing
18ac8ad2aee9: Preparing
132d61a890c5: Preparing
bfd12fa374fa: Mounted from kubevirt/cirros-registry-disk-demo
18ac8ad2aee9: Mounted from kubevirt/cirros-registry-disk-demo
132d61a890c5: Mounted from kubevirt/cirros-registry-disk-demo
e52196ed8281: Pushed
devel: digest: sha256:69cdde0237ca83ba2a1b0c7bef9a3d02f76bbcee3c094ecec74b1c2627de8a89 size: 1161
The push refers to a repository [localhost:35003/kubevirt/alpine-registry-disk-demo]
800a7891dfaa: Preparing
bfd12fa374fa: Preparing
18ac8ad2aee9: Preparing
132d61a890c5: Preparing
bfd12fa374fa: Mounted from kubevirt/fedora-cloud-registry-disk-demo
132d61a890c5: Mounted from kubevirt/fedora-cloud-registry-disk-demo
18ac8ad2aee9: Mounted from kubevirt/fedora-cloud-registry-disk-demo
800a7891dfaa: Pushed
devel: digest: sha256:78f82db96351724e728d99ed26dc4d267a05e809f3a6bd8a51b5491ec1e73d93 size: 1160
The push refers to a repository [localhost:35003/kubevirt/subresource-access-test]
7b5cfa283d7b: Preparing
3c1237181850: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/vm-killer
3c1237181850: Pushed
7b5cfa283d7b: Pushed
devel: digest: sha256:83c9faa22101f31b04053471561447be84012154256a62049abf7e53b0cc309e size: 948
The push refers to a repository [localhost:35003/kubevirt/winrmcli]
bf2bff760365: Preparing
589098974698: Preparing
6e22155a44ef: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/subresource-access-test
bf2bff760365: Pushed
6e22155a44ef: Pushed
589098974698: Pushed
devel: digest: sha256:045705e183ba36d0c98ca8f5a9d5b7486f44fbae7e66b6a449088f22d7865f6f size: 1165
The push refers to a repository [localhost:35003/kubevirt/example-hook-sidecar]
a8523db618ef: Preparing
39bae602f753: Preparing
a8523db618ef: Pushed
39bae602f753: Pushed
devel: digest: sha256:5773fcb6e17c0603a69e3ebb60218255fd9c51d76649590dbda0e620f11d58ac size: 740
make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt'
Done
./cluster/clean.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.10.3-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.10.3-release0
++ job_prefix=kubevirt-functional-tests-k8s-1.10.3-release0
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-99-g9e88a00
++ KUBEVIRT_VERSION=v0.7.0-99-g9e88a00
+ source cluster/k8s-1.10.3/provider.sh
++ set -e
++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.3.sh
++ source hack/config-provider-k8s-1.10.3.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl
+++ docker_prefix=localhost:35003/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers
+ grep foregroundDeleteVirtualMachine
+ read p
error: the server doesn't have a resource type "vmis"
+ _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0
No resources found
+ _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0
No resources found
+ _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0
No resources found
+ namespaces=(default ${namespace})
+ for i in '${namespaces[@]}'
+ _kubectl -n default delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n default delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete deployment -l kubevirt.io
No resources found
+ _kubectl -n default delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete rs -l kubevirt.io
No resources found
+ _kubectl -n default delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete services -l kubevirt.io
No resources found
+ _kubectl -n default delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n default delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete secrets -l kubevirt.io
No resources found
+ _kubectl -n default delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pv -l kubevirt.io
No resources found
+ _kubectl -n default delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pvc -l kubevirt.io
No resources found
+ _kubectl -n default delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete ds -l kubevirt.io
No resources found
+ _kubectl -n default delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n default delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pods -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete roles -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n default delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ cluster/k8s-1.10.3/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ wc -l
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ for i in '${namespaces[@]}'
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete deployment -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete rs -l kubevirt.io
No resources found
+
_kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete customresourcedefinitions -l 
kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig ++ wc -l ++ cluster/k8s-1.10.3/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done 
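The cleanup trace above iterates over the `default` and `kube-system` namespaces and deletes every resource type carrying the `kubevirt.io` label. A minimal self-contained sketch of that loop follows; `_kubectl` is stubbed here to just record the call (the real wrapper exports `KUBECONFIG` and invokes the provider's `cluster/k8s-1.10.3/.kubectl`), and the list of resource types is taken from the trace:

```shell
#!/usr/bin/env bash
set -e

# Stub standing in for the real wrapper, which exports KUBECONFIG and
# calls the provider kubectl; here it only echoes the call it would make.
_kubectl() {
    echo "kubectl $*"
}

namespace=kube-system
namespaces=(default "${namespace}")
resource_types="apiservices deployment rs services validatingwebhookconfiguration \
secrets pv pvc ds customresourcedefinitions pods clusterrolebinding rolebinding \
roles clusterroles serviceaccounts"

# Delete everything labeled kubevirt.io in each namespace, as in the trace.
for ns in "${namespaces[@]}"; do
    for type in ${resource_types}; do
        _kubectl -n "${ns}" delete "${type}" -l kubevirt.io
    done
done
```

Note that cluster-scoped types in the list (`pv`, `clusterroles`, `customresourcedefinitions`, ...) are still deleted correctly even though a `-n` flag is passed, since kubectl ignores the namespace for cluster-scoped resources, which is why the trace shows them handled in the same loop.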
./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.10.3-release ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.10.3-release0 ++ job_prefix=kubevirt-functional-tests-k8s-1.10.3-release0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-99-g9e88a00 ++ KUBEVIRT_VERSION=v0.7.0-99-g9e88a00 + source cluster/k8s-1.10.3/provider.sh ++ set -e ++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357 ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host 
--rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ KUBEVIRT_PROVIDER=k8s-1.10.3 ++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.10.3.sh ++ source hack/config-provider-k8s-1.10.3.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl +++ docker_prefix=localhost:35003/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... 
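The deployment step that follows iterates over the generated release manifests and skips anything matching `.*demo.*` before `kubectl create`-ing the rest. A self-contained sketch of that pattern, with `_kubectl` stubbed and a temporary directory standing in for `${MANIFESTS_OUT_DIR}` (the real script points it at `_out/manifests`):

```shell
#!/usr/bin/env bash
set -e

# Temporary stand-in for the real _out/manifests tree.
MANIFESTS_OUT_DIR=$(mktemp -d)
mkdir -p "${MANIFESTS_OUT_DIR}/release"
touch "${MANIFESTS_OUT_DIR}/release/demo-content.yaml" \
      "${MANIFESTS_OUT_DIR}/release/kubevirt.yaml"

# Stub for the wrapper that exports KUBECONFIG and calls the provider kubectl.
_kubectl() {
    echo "kubectl $*"
}

for manifest in "${MANIFESTS_OUT_DIR}"/release/*; do
    # Demo manifests are not part of a release deployment, so skip them.
    [[ ${manifest} =~ .*demo.* ]] && continue
    _kubectl create -f "${manifest}"
done
```

This mirrors the `continue` taken for `demo-content.yaml` and the `create -f .../kubevirt.yaml` visible in the trace below.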
+ [[ -z k8s-1.10.3-release ]] + [[ k8s-1.10.3-release =~ .*-dev ]] + [[ k8s-1.10.3-release =~ .*-release ]] + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]] + continue + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]] + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created serviceaccount "kubevirt-apiserver" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created role.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created serviceaccount "kubevirt-controller" created serviceaccount "kubevirt-privileged" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created 
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created service "virt-api" created deployment.extensions "virt-api" created deployment.extensions "virt-controller" created daemonset.extensions "virt-handler" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R persistentvolumeclaim "disk-alpine" created persistentvolume "host-path-disk-alpine" created persistentvolumeclaim "disk-custom" created persistentvolume "host-path-disk-custom" created daemonset.extensions "disks-images-provider" created serviceaccount "kubevirt-testing" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created + [[ k8s-1.10.3 =~ os-* ]] + echo Done Done + namespaces=(kube-system default) + [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]] + timeout=300 + sample=30 + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get pods -n kube-system --no-headers ++ grep -v Running + '[' -n 'virt-api-7d79764579-h8lwp 0/1 ContainerCreating 0 4s virt-api-7d79764579-wccxk 0/1 ContainerCreating 0 4s virt-controller-7d57d96b65-6l7pp 0/1 ContainerCreating 0 4s virt-controller-7d57d96b65-bgmjz 0/1 
ContainerCreating 0 4s virt-handler-8zs9s 0/1 ContainerCreating 0 4s virt-handler-d4p7v 0/1 ContainerCreating 0 4s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ cluster/kubectl.sh get pods -n kube-system --no-headers
+ grep -v Running
disks-images-provider-jcfpz        0/1  ContainerCreating  0  2s
disks-images-provider-jr7kq        0/1  ContainerCreating  0  2s
virt-api-7d79764579-h8lwp          0/1  ContainerCreating  0  5s
virt-api-7d79764579-wccxk          0/1  ContainerCreating  0  5s
virt-controller-7d57d96b65-6l7pp   0/1  ContainerCreating  0  5s
virt-controller-7d57d96b65-bgmjz   0/1  ContainerCreating  0  5s
virt-handler-8zs9s                 0/1  ContainerCreating  0  5s
virt-handler-d4p7v                 0/1  ContainerCreating  0  5s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n kube-system --no-headers
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                               READY  STATUS   RESTARTS  AGE
disks-images-provider-jcfpz        1/1    Running  0         38s
disks-images-provider-jr7kq        1/1    Running  0         38s
etcd-node01                        1/1    Running  0         12m
kube-apiserver-node01              1/1    Running  0         13m
kube-controller-manager-node01     1/1    Running  0         12m
kube-dns-86f4d74b45-pxxr2          3/3    Running  0         13m
kube-flannel-ds-5ckld              1/1    Running  0         13m
kube-flannel-ds-ccwjh              1/1    Running  0         13m
kube-proxy-qdzz8                   1/1    Running  0         13m
kube-proxy-vp52x                   1/1    Running  0         13m
kube-scheduler-node01              1/1    Running  0         12m
virt-api-7d79764579-h8lwp          1/1    Running  1         41s
virt-api-7d79764579-wccxk          1/1    Running  0         41s
virt-controller-7d57d96b65-6l7pp   1/1    Running  0         41s
virt-controller-7d57d96b65-bgmjz   1/1    Running  0         41s
virt-handler-8zs9s                 1/1    Running  0         41s
virt-handler-d4p7v                 1/1    Running  0         41s
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
No resources found.
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
No resources found.
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:17:39Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:05:37Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/junit.xml'
+ [[ k8s-1.10.3-release =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.3-release/junit.xml'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:7fb8539d32771bf74786d31102b8c102fc61586b172276b4710c6944077751f4
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
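The readiness gate traced above polls each namespace until no pod reports a non-Running status, giving up after a timeout (the trace shows `timeout=300` and `sample=30`). A self-contained sketch of that polling pattern; `list_not_running` and `wait_for_running` are hypothetical helper names, and the stub simulates "all pods Running" where the real script runs `cluster/kubectl.sh get pods -n <ns> --no-headers`:

```shell
#!/usr/bin/env bash
set -e

timeout=300   # seconds before giving up, as in the trace
sample=30     # seconds between polls, as in the trace

# Stub: the real script shells out to cluster/kubectl.sh; returning no
# output here simulates a namespace where every pod is Running.
list_not_running() {
    true
}

wait_for_running() {
    local ns=$1 current_time=0 pending
    while true; do
        pending=$(list_not_running "${ns}" | grep -v Running || true)
        [ -z "${pending}" ] && return 0
        echo "Waiting for ${ns} pods to enter the Running state ..."
        sleep "${sample}"
        current_time=$((current_time + sample))
        if [ "${current_time}" -gt "${timeout}" ]; then
            echo "Timed out waiting on ${ns}" >&2
            return 1
        fi
    done
}

for ns in kube-system default; do
    wait_for_running "${ns}"
done
```

The second pass in the trace (the `-ocustom-columns=status:status.containerStatuses[*].ready` query piped through `grep false`) repeats the same loop shape against container readiness rather than pod phase.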
compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1532369608 Will run 144 of 144 specs • [SLOW TEST:51.184 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:35 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:64 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 with a cirros image /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66 should return that we are running cirros /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:67 ------------------------------ • [SLOW TEST:64.840 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:35 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:64 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 with a fedora image /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:76 should return that we are running fedora /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:77 ------------------------------ • [SLOW TEST:40.325 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:35 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:64 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 should be able to reconnect to console multiple times /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:86 ------------------------------ •• ------------------------------ • [SLOW TEST:18.740 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 should start it /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:80 ------------------------------ • [SLOW TEST:18.314 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 should attach virt-launcher to it /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:86 ------------------------------ •••• ------------------------------ • [SLOW TEST:37.587 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174 should be able to boot from selected disk /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Alpine as first boot /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:30.006 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174 should be able to boot from selected disk /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Cirros as first boot /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:15.081 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should retry starting the VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:207 ------------------------------ • [SLOW TEST:19.029 seconds] VMIlifecycle 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should log warning and proceed once the secret is there /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:237 ------------------------------ • [SLOW TEST:41.443 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 when virt-launcher crashes /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:285 should be stopped and have Failed phase /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:286 ------------------------------ • [SLOW TEST:26.051 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 when virt-handler crashes /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:308 should recover and continue management /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:309 ------------------------------ • [SLOW TEST:48.692 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 when virt-handler is responsive /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:339 should indicate that a node is ready for vmis /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:340 ------------------------------ • [SLOW TEST:152.720 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 when virt-handler is not responsive /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:370 the node controller should react /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:409 ------------------------------ • [SLOW TEST:65.231 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with non default namespace /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:462 should log libvirt start and stop lifecycle events of the domain /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 kubevirt-test-default /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:41.519 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with non default namespace /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:462 should log libvirt start and stop lifecycle events of the domain /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 kubevirt-test-alternative /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.103 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 VirtualMachineInstance Emulation Mode /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:519 should enable emulation in virt-launcher [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:531 Software emulation is 
not enabled on this cluster /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:527 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.100 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 VirtualMachineInstance Emulation Mode /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:519 should be reflected in domain XML [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:568 Software emulation is not enabled on this cluster /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:527 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.272 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 VirtualMachineInstance Emulation Mode /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:519 should request a TUN device but not KVM [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:612 Software emulation is not enabled on this cluster /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:527 ------------------------------ •••• ------------------------------ • [SLOW TEST:20.165 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance's Pod /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:764 should result in the VirtualMachineInstance moving to a finalized state /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:765 ------------------------------ • [SLOW TEST:74.209 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:796 with an active pod. 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:797 should result in pod being terminated /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:798 ------------------------------ • [SLOW TEST:43.982 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:796 with ACPI and 0 grace period seconds /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:822 should result in vmi status failed /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:823 ------------------------------ • [SLOW TEST:50.307 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:796 with ACPI and some grace period seconds /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:847 should result in vmi status succeeded /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:848 ------------------------------ • [SLOW TEST:23.508 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:796 with grace period greater than 0 /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:872 should run graceful shutdown /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:873 ------------------------------ • [SLOW TEST:68.218 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Killed VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:924 should be in Failed phase /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:925 ------------------------------ Pod name: disks-images-provider-jcfpz Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jr7kq Pod phase: Running 
copy all images to host mount directory Pod name: virt-api-7d79764579-h8lwp Pod phase: Running 2018/07/23 18:29:28 http: TLS handshake error from 10.244.1.1:34088: EOF 2018/07/23 18:29:38 http: TLS handshake error from 10.244.1.1:34094: EOF 2018/07/23 18:29:48 http: TLS handshake error from 10.244.1.1:34100: EOF Pod name: virt-api-7d79764579-wccxk Pod phase: Running 2018/07/23 18:29:09 http: TLS handshake error from 10.244.0.1:45288: EOF level=info timestamp=2018-07-23T18:29:11.990604Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:29:12.674584Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:29:13.616079Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:29:16.268423Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/23 18:29:19 http: TLS handshake error from 10.244.0.1:45312: EOF level=info timestamp=2018-07-23T18:29:28.534975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/23 18:29:29 http: TLS handshake error from 10.244.0.1:45336: EOF 2018/07/23 18:29:39 http: TLS handshake error from 10.244.0.1:45360: EOF level=info timestamp=2018-07-23T18:29:41.926722Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:29:42.808838Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:29:43.732069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:29:49 http: TLS handshake error from 10.244.0.1:45384: EOF level=info timestamp=2018-07-23T18:29:50.978626Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:29:50.982192Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-bgmjz Pod phase: Running level=info timestamp=2018-07-23T18:25:03.929536Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvminv669 kind= uid=b7ca022a-8ea5-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:26:18.074072Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wblk kind= uid=e3f878f7-8ea5-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:26:18.078576Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wblk kind= uid=e3f878f7-8ea5-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:26:18.309363Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io 
\"testvmi5wblk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5wblk" level=info timestamp=2018-07-23T18:27:02.201144Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilvpwb kind= uid=fe3b3894-8ea5-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:27:02.204041Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilvpwb kind= uid=fe3b3894-8ea5-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:27:02.416495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilvpwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilvpwb" level=info timestamp=2018-07-23T18:27:52.424949Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:27:52.425247Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:27:52.594837Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6qw6l\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6qw6l" level=info timestamp=2018-07-23T18:28:15.991816Z pos=preset.go:139 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:28:15.995197Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m" level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-j242z Pod phase: Running level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler 
namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n" level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:29:15.450287Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind=VirtualMachineInstance uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:29:15.450479Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiv4d8m, existing: true\n"
level=info timestamp=2018-07-23T18:29:15.450523Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:29:15.450555Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:29:15.450721Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T18:29:15.450793Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:29:23.852213Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiv4d8m, existing: true\n"
level=info timestamp=2018-07-23T18:29:23.853790Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:29:23.853890Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:29:23.854492Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:29:23.855201Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:29:23.953534Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiv4d8m, existing: false\n"
level=info timestamp=2018-07-23T18:29:23.953704Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:29:23.953859Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:29:23.954140Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv4d8m kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmivpz65-b6lwt
Pod phase: Pending

• Failure [31.015 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:924
    should be left alone by virt-handler [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:952

    Expected
        <*errors.StatusError | 0xc42016aa20>: {
            ErrStatus: {
                TypeMeta: {Kind: "", APIVersion: ""},
                ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                Status: "Failure",
                Message: "Timeout: request did not complete within allowed duration",
                Reason: "Timeout",
                Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                Code: 504,
            },
        }
    to be nil

    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:955
------------------------------
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:29:28 http: TLS handshake error from 10.244.1.1:34088: EOF
2018/07/23 18:29:38 http: TLS handshake error from 10.244.1.1:34094: EOF
2018/07/23
18:29:48 http: TLS handshake error from 10.244.1.1:34100: EOF 2018/07/23 18:29:58 http: TLS handshake error from 10.244.1.1:34106: EOF 2018/07/23 18:30:08 http: TLS handshake error from 10.244.1.1:34112: EOF 2018/07/23 18:30:18 http: TLS handshake error from 10.244.1.1:34118: EOF 2018/07/23 18:30:28 http: TLS handshake error from 10.244.1.1:34124: EOF 2018/07/23 18:30:38 http: TLS handshake error from 10.244.1.1:34130: EOF 2018/07/23 18:30:48 http: TLS handshake error from 10.244.1.1:34136: EOF 2018/07/23 18:30:58 http: TLS handshake error from 10.244.1.1:34142: EOF 2018/07/23 18:31:08 http: TLS handshake error from 10.244.1.1:34148: EOF 2018/07/23 18:31:18 http: TLS handshake error from 10.244.1.1:34154: EOF Pod name: virt-api-7d79764579-wccxk Pod phase: Running level=info timestamp=2018-07-23T18:30:12.931938Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:30:13.856453Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:30:19 http: TLS handshake error from 10.244.0.1:45458: EOF 2018/07/23 18:30:29 http: TLS handshake error from 10.244.0.1:45482: EOF 2018/07/23 18:30:39 http: TLS handshake error from 10.244.0.1:45506: EOF level=info timestamp=2018-07-23T18:30:41.903675Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:30:43.051245Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:30:43.972618Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:30:49 http: TLS handshake error from 10.244.0.1:45530: EOF 2018/07/23 18:30:59 http: TLS handshake error from 10.244.0.1:45554: EOF 2018/07/23 18:31:09 http: TLS handshake error from 10.244.0.1:45578: EOF level=info timestamp=2018-07-23T18:31:11.980497Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:31:13.180242Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:31:14.080025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:31:19 http: TLS handshake error from 10.244.0.1:45602: EOF Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-bgmjz Pod phase: Running level=info timestamp=2018-07-23T18:26:18.309363Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5wblk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5wblk" level=info timestamp=2018-07-23T18:27:02.201144Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilvpwb kind= uid=fe3b3894-8ea5-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:27:02.204041Z pos=preset.go:165 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmilvpwb kind= uid=fe3b3894-8ea5-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:27:02.416495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilvpwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilvpwb" level=info timestamp=2018-07-23T18:27:52.424949Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:27:52.425247Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:27:52.594837Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6qw6l\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6qw6l" level=info timestamp=2018-07-23T18:28:15.991816Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:28:15.995197Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 
component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m" level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9" Pod name: virt-handler-j242z Pod phase: Running level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi 
testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:30:41.475918Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-23T18:30:41.490665Z pos=vm.go:431 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-23T18:30:41.490806Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:30:41.490876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi8xbk9, existing: true\n"
level=info timestamp=2018-07-23T18:30:41.490899Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:30:41.490927Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:30:41.490947Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:30:41.491013Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:30:41.502331Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:30:41.502400Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi8xbk9, existing: true\n"
level=info timestamp=2018-07-23T18:30:41.502437Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:30:41.502467Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:30:41.502487Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:30:41.502551Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:30:41.507651Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi8xbk9-vfvjk
Pod phase: Running
level=info timestamp=2018-07-23T18:30:40.720473Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:30:40.743547Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 68444b15-62b1-4961-9537-1742743455f0"
level=info timestamp=2018-07-23T18:30:40.745730Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:30:40.751448Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:30:41.359355Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:30:41.408664Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:30:41.408841Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:30:41.410675Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:30:41.413579Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:30:41.413886Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:30:41.451598Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:30:41.453940Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:30:41.495217Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:30:41.507163Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:30:41.752978Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 68444b15-62b1-4961-9537-1742743455f0: 183"

• Failure [92.597 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with Disk PVC [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Timed out after 92.035s.
    Expected error:
        <*errors.StatusError | 0xc420632d80>: {
            ErrStatus: {
                TypeMeta: {Kind: "", APIVersion: ""},
                ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                Status: "Failure",
                Message: "Timeout: request did not complete within allowed duration",
                Reason: "Timeout",
                Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                Code: 504,
            },
        }
        Timeout: request did not complete within allowed duration
    not to have occurred

    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:30:38 http: TLS handshake error from 10.244.1.1:34130: EOF
2018/07/23 18:30:48 http: TLS handshake error from 10.244.1.1:34136: EOF
2018/07/23 18:30:58 http: TLS handshake error from 10.244.1.1:34142: EOF
2018/07/23 18:31:08 http: TLS handshake error from 10.244.1.1:34148: EOF
2018/07/23 18:31:18 http: TLS handshake error from 10.244.1.1:34154: EOF
2018/07/23 18:31:28 http: TLS handshake error from 10.244.1.1:34160: EOF
2018/07/23 18:31:38 http: TLS handshake error from 10.244.1.1:34166: EOF
2018/07/23 18:31:48 http: TLS handshake error from 10.244.1.1:34172: EOF
2018/07/23 18:31:58 http: TLS handshake error from 10.244.1.1:34178: EOF
2018/07/23 18:32:08 http: TLS handshake error from 10.244.1.1:34184: EOF
2018/07/23 18:32:18 http: TLS handshake error from 10.244.1.1:34190: EOF
2018/07/23 18:32:28 http: TLS handshake error from 10.244.1.1:34196: EOF
2018/07/23 18:32:38 http: TLS handshake error from 10.244.1.1:34202: EOF
2018/07/23 18:32:48 http: TLS handshake error from 10.244.1.1:34208: EOF
2018/07/23 18:32:58 http: TLS handshake error from 10.244.1.1:34214: EOF
Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info
timestamp=2018-07-23T18:31:50.986730Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:31:50.990096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:31:59 http: TLS handshake error from 10.244.0.1:45698: EOF
2018/07/23 18:32:09 http: TLS handshake error from 10.244.0.1:45722: EOF
level=info timestamp=2018-07-23T18:32:12.010101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:32:13.487775Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:32:14.334773Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:32:19 http: TLS handshake error from 10.244.0.1:45746: EOF
2018/07/23 18:32:29 http: TLS handshake error from 10.244.0.1:45770: EOF
2018/07/23 18:32:39 http: TLS handshake error from 10.244.0.1:45794: EOF
level=info timestamp=2018-07-23T18:32:41.904887Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:32:43.623186Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:32:44.451190Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:32:49 http: TLS handshake error from 10.244.0.1:45818: EOF
2018/07/23 18:32:59 http: TLS handshake error from 10.244.0.1:45842: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:27:02.416495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilvpwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilvpwb"
level=info timestamp=2018-07-23T18:27:52.424949Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:27:52.425247Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:27:52.594837Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6qw6l\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6qw6l"
level=info timestamp=2018-07-23T18:28:15.991816Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:28:15.995197Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m"
level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:32:16.182978Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind=Domain uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-23T18:32:16.223434Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-23T18:32:16.229078Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:32:16.229278Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmid97vz, existing: true\n"
level=info timestamp=2018-07-23T18:32:16.229346Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-23T18:32:16.229384Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:32:16.229415Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:32:16.229543Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T18:32:16.289446Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:32:16.289546Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmid97vz, existing: true\n"
level=info timestamp=2018-07-23T18:32:16.289568Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:32:16.289594Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:32:16.289613Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:32:16.289685Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:32:16.303078Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmid97vz-6rlw5
Pod phase: Running
level=info timestamp=2018-07-23T18:32:14.775307Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-23T18:32:15.493905Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:32:15.513773Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:32:15.889445Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 667cb997-aedc-49bd-a9e6-f3e0fbc2b419"
level=info timestamp=2018-07-23T18:32:15.890464Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:32:16.150180Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:32:16.178321Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:32:16.184668Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:32:16.184781Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:32:16.216220Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:32:16.220504Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:32:16.221820Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:32:16.226365Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:32:16.302717Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:32:16.899917Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 667cb997-aedc-49bd-a9e6-f3e0fbc2b419: 183"

• Failure [92.615 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with CDRom PVC [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Timed out after 92.025s.
        Expected error:
            <*errors.StatusError | 0xc4205ca120>: {
                ErrStatus: {
                    TypeMeta: {Kind: "", APIVersion: ""},
                    ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                    Status: "Failure",
                    Message: "Timeout: request did not complete within allowed duration",
                    Reason: "Timeout",
                    Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                    Code: 504,
                },
            }
            Timeout: request did not complete within allowed duration
        not to have occurred

        /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting a VirtualMachineInstance

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:32:08 http: TLS handshake error from 10.244.1.1:34184: EOF
2018/07/23 18:32:18 http: TLS handshake error from 10.244.1.1:34190: EOF
2018/07/23 18:32:28 http: TLS handshake error from 10.244.1.1:34196: EOF
2018/07/23 18:32:38 http: TLS handshake error from 10.244.1.1:34202: EOF
2018/07/23 18:32:48 http: TLS handshake error from 10.244.1.1:34208: EOF
2018/07/23 18:32:58 http: TLS handshake error from 10.244.1.1:34214: EOF
2018/07/23 18:33:08 http: TLS handshake error from 10.244.1.1:34220: EOF
2018/07/23 18:33:18 http: TLS handshake error from 10.244.1.1:34226: EOF
2018/07/23 18:33:28 http: TLS handshake error from 10.244.1.1:34232: EOF
2018/07/23 18:33:38 http: TLS handshake error from 10.244.1.1:34238: EOF
2018/07/23 18:33:48 http: TLS handshake error from 10.244.1.1:34244: EOF
2018/07/23 18:33:58 http: TLS handshake error from 10.244.1.1:34250: EOF
2018/07/23 18:34:08 http: TLS handshake error from 10.244.1.1:34256: EOF
2018/07/23 18:34:18 http: TLS handshake error from 10.244.1.1:34262: EOF
2018/07/23 18:34:28 http: TLS handshake error from 10.244.1.1:34268: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
2018/07/23 18:33:29 http: TLS handshake error from 10.244.0.1:45914: EOF
2018/07/23 18:33:39 http: TLS handshake error from 10.244.0.1:45938: EOF
level=info timestamp=2018-07-23T18:33:42.037054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:33:44.172536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:33:44.658099Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:33:49 http: TLS handshake error from 10.244.0.1:45962: EOF
level=info timestamp=2018-07-23T18:33:49.728753Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:33:49.733099Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:33:59 http: TLS handshake error from 10.244.0.1:45986: EOF
2018/07/23 18:34:09 http: TLS handshake error from 10.244.0.1:46010: EOF
level=info timestamp=2018-07-23T18:34:11.886575Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:34:14.289100Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:34:14.780240Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:34:19 http: TLS handshake error from 10.244.0.1:46034: EOF
2018/07/23 18:34:29 http: TLS handshake error from 10.244.0.1:46058: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:27:52.425247Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:27:52.594837Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6qw6l\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6qw6l"
level=info timestamp=2018-07-23T18:28:15.991816Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:28:15.995197Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m"
level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:33:46.865274Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-23T18:33:46.873944Z pos=vm.go:431 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7vjqw\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-23T18:33:46.874081Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7vjqw\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7vjqw"
level=info timestamp=2018-07-23T18:33:46.874143Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi7vjqw, existing: true\n"
level=info timestamp=2018-07-23T18:33:46.874162Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:33:46.874184Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:33:46.874201Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:33:46.874256Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:33:46.878494Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:33:46.881249Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi7vjqw, existing: true\n"
level=info timestamp=2018-07-23T18:33:46.881293Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:33:46.881323Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:33:46.881341Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:33:46.881405Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:33:46.884711Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmi7vjqw-p8xqr
Pod phase: Running
level=info timestamp=2018-07-23T18:33:46.140227Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:33:46.161108Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:33:46.170850Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID e5812eb8-1356-4e66-b116-4da425ada026"
level=info timestamp=2018-07-23T18:33:46.171569Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:33:46.753286Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:33:46.782969Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:33:46.785088Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:33:46.803417Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:33:46.821049Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:33:46.822681Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:33:46.830739Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:33:46.832456Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:33:46.878201Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:33:46.884346Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:33:47.176600Z pos=monitor.go:222 component=virt-launcher msg="Found PID for e5812eb8-1356-4e66-b116-4da425ada026: 180"

• Failure [92.910 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started and stopped multiple times
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with Disk PVC [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Timed out after 92.027s.
        Expected error:
            <*errors.StatusError | 0xc4205ca1b0>: {
                ErrStatus: {
                    TypeMeta: {Kind: "", APIVersion: ""},
                    ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                    Status: "Failure",
                    Message: "Timeout: request did not complete within allowed duration",
                    Reason: "Timeout",
                    Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                    Code: 504,
                },
            }
            Timeout: request did not complete within allowed duration
        not to have occurred

        /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting and stopping the VirtualMachineInstance number of times
STEP: Starting a VirtualMachineInstance

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:33:38 http: TLS handshake error from 10.244.1.1:34238: EOF
2018/07/23 18:33:48 http: TLS handshake error from 10.244.1.1:34244: EOF
2018/07/23 18:33:58 http: TLS handshake error from 10.244.1.1:34250: EOF
2018/07/23 18:34:08 http: TLS handshake error from 10.244.1.1:34256: EOF
2018/07/23 18:34:18 http: TLS handshake error from 10.244.1.1:34262: EOF
2018/07/23 18:34:28 http: TLS handshake error from 10.244.1.1:34268: EOF
2018/07/23 18:34:38 http: TLS handshake error from 10.244.1.1:34274: EOF
2018/07/23 18:34:48 http: TLS handshake error from 10.244.1.1:34280: EOF
2018/07/23 18:34:58 http: TLS handshake error from 10.244.1.1:34286: EOF
2018/07/23 18:35:08 http: TLS handshake error from 10.244.1.1:34292: EOF
2018/07/23 18:35:18 http: TLS handshake error from 10.244.1.1:34298: EOF
2018/07/23 18:35:28 http: TLS handshake error from 10.244.1.1:34304: EOF
2018/07/23 18:35:38 http: TLS handshake error from 10.244.1.1:34310: EOF
2018/07/23 18:35:48 http: TLS handshake error from 10.244.1.1:34316: EOF
2018/07/23 18:35:58 http: TLS handshake error from 10.244.1.1:34322: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T18:34:49.738057Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:34:49.746531Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:34:59 http: TLS handshake error from 10.244.0.1:46130: EOF
2018/07/23 18:35:09 http: TLS handshake error from 10.244.0.1:46154: EOF
level=info timestamp=2018-07-23T18:35:11.989151Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:35:14.563565Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:35:15.019141Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:35:19 http: TLS handshake error from 10.244.0.1:46178: EOF
2018/07/23 18:35:29 http: TLS handshake error from 10.244.0.1:46202: EOF
2018/07/23 18:35:39 http: TLS handshake error from 10.244.0.1:46226: EOF
level=info timestamp=2018-07-23T18:35:42.074403Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:35:44.719591Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:35:45.134594Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:35:49 http: TLS handshake error from 10.244.0.1:46250: EOF
2018/07/23 18:35:59 http: TLS handshake error from 10.244.0.1:46274: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.991816Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:28:15.995197Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4d8m kind= uid=2a44f87f-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m"
level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:35:20.784879Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind=Domain uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-23T18:35:20.804346Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-23T18:35:20.806195Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:35:20.806270Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiw66d5, existing: true\n"
level=info timestamp=2018-07-23T18:35:20.806291Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-23T18:35:20.806325Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:35:20.806350Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:35:20.806398Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T18:35:20.830478Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:35:20.830549Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiw66d5, existing: true\n"
level=info timestamp=2018-07-23T18:35:20.830569Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:35:20.830593Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:35:20.830610Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:35:20.830669Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:35:20.834787Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmiw66d5-9lwsx
Pod phase: Running
level=info timestamp=2018-07-23T18:35:19.444634Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-23T18:35:20.223723Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 68912a24-4abe-48a8-aeb1-dc674210b866"
level=info timestamp=2018-07-23T18:35:20.224581Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:35:20.227793Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:35:20.271119Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:35:20.755830Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:35:20.783215Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:35:20.785257Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:35:20.788222Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:35:20.799750Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:35:20.802163Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:35:20.803289Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:35:20.808593Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:35:20.834478Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:35:21.231254Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 68912a24-4abe-48a8-aeb1-dc674210b866: 184"

• Failure [92.715 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started and stopped multiple times
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with CDRom PVC [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Timed out after 92.027s.
        Expected error:
            <*errors.StatusError | 0xc420efa360>: {
                ErrStatus: {
                    TypeMeta: {Kind: "", APIVersion: ""},
                    ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                    Status: "Failure",
                    Message: "Timeout: request did not complete within allowed duration",
                    Reason: "Timeout",
                    Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                    Code: 504,
                },
            }
            Timeout: request did not complete within allowed duration
        not to have occurred

        /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting and stopping the VirtualMachineInstance number of times
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:35:08 http: TLS handshake error from 10.244.1.1:34292: EOF
2018/07/23 18:35:18 http: TLS handshake error from 10.244.1.1:34298: EOF
2018/07/23 18:35:28 http: TLS handshake error from 10.244.1.1:34304: EOF
2018/07/23 18:35:38 http: TLS handshake error from 10.244.1.1:34310: EOF
2018/07/23 18:35:48 http: TLS handshake error from 10.244.1.1:34316: EOF
2018/07/23 18:35:58 http: TLS handshake error from 10.244.1.1:34322: EOF
2018/07/23 18:36:08 http: TLS handshake error from 10.244.1.1:34328: EOF
2018/07/23 18:36:18 http: TLS handshake error from 10.244.1.1:34334: EOF
2018/07/23 18:36:28 http: TLS handshake error from 10.244.1.1:34340: EOF
2018/07/23 18:36:38 http: TLS handshake error from 10.244.1.1:34346: EOF
2018/07/23 18:36:48 http: TLS handshake error from 10.244.1.1:34352: EOF
2018/07/23 18:36:58 http: TLS handshake error from 10.244.1.1:34358: EOF
2018/07/23 18:37:08 http: TLS handshake error from 10.244.1.1:34364: EOF
2018/07/23 18:37:18 http: TLS handshake error from 10.244.1.1:34370: EOF
2018/07/23 18:37:28 http: TLS handshake error from 10.244.1.1:34376: EOF
Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
2018/07/23 18:36:29 http: TLS handshake error from 10.244.0.1:46346: EOF
2018/07/23 18:36:39 http: TLS handshake error from 10.244.0.1:46370: EOF
level=info timestamp=2018-07-23T18:36:41.982376Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:36:44.981233Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:36:45.377498Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:36:49 http: TLS handshake error from 10.244.0.1:46394: EOF
level=info timestamp=2018-07-23T18:36:49.752245Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:36:49.755022Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:36:59 http: TLS handshake error from 10.244.0.1:46418: EOF
2018/07/23 18:37:09 http: TLS handshake error from 10.244.0.1:46442: EOF
level=info timestamp=2018-07-23T18:37:11.940216Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:37:15.105757Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:37:15.490301Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:37:19 http: TLS handshake error from 10.244.0.1:46466: EOF
2018/07/23 18:37:29 http: TLS handshake error from 10.244.0.1:46490: EOF
Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:28:16.085047Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv4d8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv4d8m"
level=info timestamp=2018-07-23T18:29:54.384459Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:29:54.392023Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpz65 kind= uid=5306a2d4-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:36:55.221300Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind=Domain uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-23T18:36:55.273365Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-23T18:36:55.271952Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:36:55.273750Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi846lq, existing: true\n"
level=info timestamp=2018-07-23T18:36:55.273789Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-23T18:36:55.273826Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:36:55.273859Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:36:55.273968Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T18:36:55.324793Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:36:55.325514Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi846lq, existing: true\n"
level=info timestamp=2018-07-23T18:36:55.325565Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:36:55.325604Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:36:55.325629Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:36:55.325717Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:36:55.331737Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi846lq-kmf75
Pod phase: Running
level=info timestamp=2018-07-23T18:36:53.370658Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-23T18:36:54.566247Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:36:54.585256Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 5c89e45d-7bc7-44ed-9f53-83b40d996ae6"
level=info timestamp=2018-07-23T18:36:54.587574Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:36:54.590837Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:36:55.154252Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:36:55.219538Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:36:55.221675Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:36:55.235693Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:36:55.255893Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:36:55.266124Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:36:55.268390Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:36:55.277141Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:36:55.331341Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:36:55.596741Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 5c89e45d-7bc7-44ed-9f53-83b40d996ae6: 198"

• Failure [92.641 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With an emptyDisk defined
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:113
      should create a writeable emptyDisk with the right capacity [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:115

      Timed out after 92.025s.
      Expected error:
          <*errors.StatusError | 0xc420632d80>: {
              ErrStatus: {
                  TypeMeta: {Kind: "", APIVersion: ""},
                  ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                  Status: "Failure",
                  Message: "Timeout: request did not complete within allowed duration",
                  Reason: "Timeout",
                  Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                  Code: 504,
              },
          }
          Timeout: request did not complete within allowed duration
      not to have occurred

      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:36:48 http: TLS handshake error from 10.244.1.1:34352: EOF
2018/07/23 18:36:58 http: TLS handshake error from 10.244.1.1:34358: EOF
2018/07/23 18:37:08 http: TLS handshake error from 10.244.1.1:34364: EOF
2018/07/23 18:37:18 http: TLS handshake error from 10.244.1.1:34370: EOF
2018/07/23 18:37:28 http: TLS handshake error from 10.244.1.1:34376: EOF
2018/07/23 18:37:38 http: TLS handshake error from 10.244.1.1:34382: EOF
2018/07/23 18:37:48 http: TLS handshake error from 10.244.1.1:34388: EOF
2018/07/23 18:37:58 http: TLS handshake error from 10.244.1.1:34394: EOF
2018/07/23 18:38:08 http: TLS handshake error from 10.244.1.1:34400: EOF
2018/07/23 18:38:18 http: TLS handshake error from 10.244.1.1:34406: EOF
2018/07/23 18:38:28 http: TLS handshake error from 10.244.1.1:34412: EOF
2018/07/23 18:38:38 http: TLS handshake error from 10.244.1.1:34418: EOF
2018/07/23 18:38:48 http: TLS handshake error from 10.244.1.1:34424: EOF
2018/07/23 18:38:58 http: TLS handshake error from 10.244.1.1:34430: EOF
2018/07/23 18:39:08 http: TLS handshake error from 10.244.1.1:34436: EOF
Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T18:38:15.394305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:38:15.725814Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:38:19 http: TLS handshake error from 10.244.0.1:46610: EOF
2018/07/23 18:38:29 http: TLS handshake error from 10.244.0.1:46634: EOF
2018/07/23 18:38:39 http: TLS handshake error from 10.244.0.1:46658: EOF
level=info timestamp=2018-07-23T18:38:41.778420Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:38:41.800527Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:38:42.032413Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:38:45.540877Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:38:45.843442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:38:49 http: TLS handshake error from 10.244.0.1:46682: EOF
level=info timestamp=2018-07-23T18:38:51.036901Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:38:51.042550Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:38:59 http: TLS handshake error from 10.244.0.1:46706: EOF
2018/07/23 18:39:09 http: TLS handshake error from 10.244.0.1:46730: EOF
Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:30:25.157923Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:30:25.159504Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8xbk9 kind= uid=655fd57b-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:30:25.377265Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8xbk9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8xbk9"
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9f8bq"
Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:38:27.971188Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:38:27.971354Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:38:27.990694Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-23T18:38:27.998223Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:38:27.998426Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9f8bq, existing: true\n"
level=info timestamp=2018-07-23T18:38:27.998480Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:38:27.998545Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:38:27.998590Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:38:27.998789Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:38:28.008424Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:38:30.096973Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:38:30.097384Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi846lq, existing: false\n"
level=info timestamp=2018-07-23T18:38:30.097475Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:38:30.097606Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:38:30.099150Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi846lq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmi846lq-kmf75
Pod phase: Pending

Pod name: virt-launcher-testvmi9f8bq-vdxj6
Pod phase: Running
level=info timestamp=2018-07-23T18:38:27.098576Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID b723b4a7-e974-4588-a698-065d7a84d0cd"
level=info timestamp=2018-07-23T18:38:27.099706Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:38:27.102251Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:38:27.121985Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:38:27.810293Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:38:27.888463Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:38:27.899070Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:38:27.909999Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:38:27.915144Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:38:27.920283Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:38:27.987353Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:38:27.997181Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:38:27.996983Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:38:28.007774Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:38:28.110490Z pos=monitor.go:222 component=virt-launcher msg="Found PID for b723b4a7-e974-4588-a698-065d7a84d0cd: 197"

• Failure [92.745 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With an emptyDisk defined and a specified serial number
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:163
      should create a writeable emptyDisk with the specified serial number [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:165

      Timed out after 92.029s.
      Expected error:
          <*errors.StatusError | 0xc420efaf30>: {
              ErrStatus: {
                  TypeMeta: {Kind: "", APIVersion: ""},
                  ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                  Status: "Failure",
                  Message: "Timeout: request did not complete within allowed duration",
                  Reason: "Timeout",
                  Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                  Code: 504,
              },
          }
          Timeout: request did not complete within allowed duration
      not to have occurred

      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting a VirtualMachineInstance

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:38:18 http: TLS handshake error from 10.244.1.1:34406: EOF
2018/07/23 18:38:28 http: TLS handshake error from 10.244.1.1:34412: EOF
2018/07/23 18:38:38 http: TLS handshake error from 10.244.1.1:34418: EOF
2018/07/23 18:38:48 http: TLS handshake error from 10.244.1.1:34424: EOF
2018/07/23 18:38:58 http: TLS handshake error from 10.244.1.1:34430: EOF
2018/07/23 18:39:08 http: TLS handshake error from 10.244.1.1:34436: EOF
2018/07/23 18:39:18 http: TLS handshake error from 10.244.1.1:34442: EOF
2018/07/23 18:39:28 http: TLS handshake error from 10.244.1.1:34448: EOF
2018/07/23 18:39:38 http: TLS handshake error from 10.244.1.1:34454: EOF
2018/07/23 18:39:48 http: TLS handshake error from 10.244.1.1:34460: EOF
2018/07/23 18:39:58 http: TLS handshake error from 10.244.1.1:34466: EOF
2018/07/23 18:40:08 http: TLS handshake error from 10.244.1.1:34472: EOF
2018/07/23 18:40:18 http: TLS handshake error from 10.244.1.1:34478: EOF
2018/07/23 18:40:28 http: TLS handshake error from 10.244.1.1:34484: EOF
2018/07/23 18:40:38 http: TLS handshake error from 10.244.1.1:34490: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T18:39:41.930272Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:39:45.782459Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:39:46.089185Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:39:49 http: TLS handshake error from 10.244.0.1:46826: EOF
level=info timestamp=2018-07-23T18:39:51.052206Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:39:51.059331Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:39:59 http: TLS handshake error from 10.244.0.1:46850: EOF
2018/07/23 18:40:09 http: TLS handshake error from 10.244.0.1:46874: EOF
level=info timestamp=2018-07-23T18:40:11.918544Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:40:15.908285Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:40:16.180006Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:40:19 http: TLS handshake error from 10.244.0.1:46898: EOF
2018/07/23 18:40:29 http: TLS handshake error from 10.244.0.1:46922: EOF
2018/07/23 18:40:39 http: TLS handshake error from 10.244.0.1:46946: EOF
level=info timestamp=2018-07-23T18:40:41.947239Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:31:57.761246Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:31:57.762339Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid97vz kind= uid=9c920f05-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9f8bq"
level=info timestamp=2018-07-23T18:39:41.551222Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:39:41.552271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:39:41.727769Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8wsx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8wsx6"

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T18:40:00.240367Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind=Domain uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-23T18:40:00.258111Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-23T18:40:00.265658Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:40:00.265799Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi8wsx6, existing: true\n"
level=info timestamp=2018-07-23T18:40:00.265821Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-23T18:40:00.265847Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:40:00.265870Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:40:00.265927Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T18:40:00.286581Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:40:00.286658Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi8wsx6, existing: true\n"
level=info timestamp=2018-07-23T18:40:00.286688Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-23T18:40:00.286712Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T18:40:00.286740Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-23T18:40:00.286814Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-23T18:40:00.289965Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmi8wsx6-9lbjh
Pod phase: Running
level=info timestamp=2018-07-23T18:39:58.663313Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-23T18:39:59.463010Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-23T18:39:59.469997Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:39:59.480321Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID f7fc8033-0c95-4923-bcea-16817186743f"
level=info timestamp=2018-07-23T18:39:59.481441Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-23T18:40:00.209198Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-23T18:40:00.235195Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:40:00.240750Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:40:00.245266Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:40:00.254087Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:40:00.255821Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:40:00.258650Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:40:00.262548Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:40:00.289639Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:40:00.486644Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f7fc8033-0c95-4923-bcea-16817186743f: 192"

• Failure [92.750 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With ephemeral alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
      should be successfully started [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:207

      Timed out after 92.033s.
      Expected error:
          <*errors.StatusError | 0xc4201362d0>: {
              ErrStatus: {
                  TypeMeta: {Kind: "", APIVersion: ""},
                  ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                  Status: "Failure",
                  Message: "Timeout: request did not complete within allowed duration",
                  Reason: "Timeout",
                  Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                  Code: 504,
              },
          }
          Timeout: request did not complete within allowed duration
      not to have occurred

      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting a VirtualMachineInstance

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 18:39:48 http: TLS handshake error from 10.244.1.1:34460: EOF
2018/07/23 18:39:58 http: TLS handshake error from 10.244.1.1:34466: EOF
2018/07/23 18:40:08 http: TLS handshake error from 10.244.1.1:34472: EOF
2018/07/23 18:40:18 http: TLS handshake error from 10.244.1.1:34478: EOF
2018/07/23 18:40:28 http: TLS handshake error from 10.244.1.1:34484: EOF
2018/07/23 18:40:38 http: TLS handshake error from 10.244.1.1:34490: EOF
2018/07/23 18:40:48 http: TLS handshake error from 10.244.1.1:34496: EOF
2018/07/23 18:40:58 http: TLS handshake error from 10.244.1.1:34502: EOF
2018/07/23 18:41:08 http: TLS handshake error from 10.244.1.1:34508: EOF
2018/07/23 18:41:18 http: TLS handshake error from 10.244.1.1:34514: EOF
2018/07/23 18:41:28 http: TLS handshake error from 10.244.1.1:34520: EOF
2018/07/23 18:41:38 http: TLS handshake error from 10.244.1.1:34526: EOF
2018/07/23 18:41:48 http: TLS handshake error from 10.244.1.1:34532: EOF
2018/07/23 18:41:58 http: TLS handshake error from 10.244.1.1:34538: EOF
2018/07/23 18:42:08 http: TLS handshake error from 10.244.1.1:34544: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T18:41:11.968357Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:41:16.136201Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:41:16.394451Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:41:19 http: TLS handshake error from 10.244.0.1:47042: EOF
2018/07/23 18:41:29 http: TLS handshake error from 10.244.0.1:47066: EOF
2018/07/23 18:41:39 http: TLS handshake error from 10.244.0.1:47090: EOF
level=info timestamp=2018-07-23T18:41:41.980515Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:41:46.258375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T18:41:46.500147Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 18:41:49 http: TLS handshake error from 10.244.0.1:47114: EOF
level=info timestamp=2018-07-23T18:41:51.069778Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T18:41:51.073383Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 18:41:59 http: TLS handshake error from 10.244.0.1:47138: EOF
2018/07/23 18:42:09 http: TLS handshake error from 10.244.0.1:47162: EOF
level=info timestamp=2018-07-23T18:42:11.952261Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-bgmjz
Pod phase: Running
level=info timestamp=2018-07-23T18:31:57.949591Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid97vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid97vz"
level=info timestamp=2018-07-23T18:33:30.408577Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9f8bq"
level=info timestamp=2018-07-23T18:39:41.551222Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:39:41.552271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-23T18:39:41.727769Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8wsx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8wsx6"
level=info timestamp=2018-07-23T18:41:14.199968Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-23T18:41:14.200912Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n"
level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n" level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-xkbfk Pod phase: Running level=info timestamp=2018-07-23T18:41:31.261184Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="No update processing required" level=error timestamp=2018-07-23T18:41:31.278193Z pos=vm.go:431 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi64rz8\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-07-23T18:41:31.278409Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi64rz8\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi64rz8" level=info timestamp=2018-07-23T18:41:31.278540Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi64rz8, existing: true\n" level=info timestamp=2018-07-23T18:41:31.278600Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T18:41:31.278644Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:41:31.278674Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:41:31.278804Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-23T18:41:31.285683Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:41:31.285792Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi64rz8, existing: true\n" level=info timestamp=2018-07-23T18:41:31.285829Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T18:41:31.285874Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:41:31.285904Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:41:31.286103Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-23T18:41:31.294262Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmi64rz8-bzsg8 Pod phase: Running level=info timestamp=2018-07-23T18:41:30.368989Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:41:30.381974Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID a3e082eb-a506-473e-8dd2-8eeff0d1aac1" level=info timestamp=2018-07-23T18:41:30.382288Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:41:30.385467Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:41:31.132607Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:41:31.157641Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:41:31.159734Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:41:31.175412Z pos=client.go:136 component=virt-launcher 
msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:41:31.182994Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:41:31.184580Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:41:31.200190Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:41:31.202162Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:41:31.285160Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:41:31.292229Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:41:31.390203Z pos=monitor.go:222 component=virt-launcher msg="Found PID for a3e082eb-a506-473e-8dd2-8eeff0d1aac1: 187"

• Failure [92.697 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With ephemeral alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
      should not persist data [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:218

      Timed out after 92.032s.
      Expected error:
          <*errors.StatusError | 0xc42016aa20>: {
              ErrStatus: {
                  TypeMeta: {Kind: "", APIVersion: ""},
                  ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                  Status: "Failure",
                  Message: "Timeout: request did not complete within allowed duration",
                  Reason: "Timeout",
                  Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                  Code: 504,
              },
          }
          Timeout: request did not complete within allowed duration
      not to have occurred
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp Pod phase: Running
2018/07/23 18:41:58 http: TLS handshake error from 10.244.1.1:34538: EOF
2018/07/23 18:42:08 http: TLS handshake error from 10.244.1.1:34544: EOF
2018/07/23 18:42:18 http: TLS handshake error from 10.244.1.1:34550: EOF
2018/07/23 18:42:28 http: TLS handshake error from 10.244.1.1:34556: EOF
2018/07/23 18:42:38 http: TLS handshake error from 10.244.1.1:34562: EOF
2018/07/23 18:42:48 http: TLS handshake error from 10.244.1.1:34568: EOF
2018/07/23 18:42:58 http: TLS handshake error from 10.244.1.1:34574: EOF
2018/07/23 18:43:08 http: TLS handshake error from 10.244.1.1:34580: EOF
2018/07/23 18:43:18 http: TLS handshake error from 10.244.1.1:34586: EOF
2018/07/23 18:43:28 http: TLS handshake error from 10.244.1.1:34592: EOF
2018/07/23 18:43:38 http: TLS handshake error from 10.244.1.1:34598: EOF
2018/07/23 18:43:48 http: TLS handshake error from 10.244.1.1:34604: EOF
2018/07/23 18:43:58 http: TLS handshake error from 10.244.1.1:34610: EOF
2018/07/23 18:44:08 http: TLS handshake error from 10.244.1.1:34616: EOF
2018/07/23 18:44:18 http: TLS handshake error from 10.244.1.1:34622: EOF
Pod name:
virt-api-7d79764579-wccxk Pod phase: Running 2018/07/23 18:43:19 http: TLS handshake error from 10.244.0.1:47330: EOF 2018/07/23 18:43:29 http: TLS handshake error from 10.244.0.1:47354: EOF 2018/07/23 18:43:39 http: TLS handshake error from 10.244.0.1:47378: EOF level=info timestamp=2018-07-23T18:43:42.090340Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:43:46.736956Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:43:47.017415Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:43:49 http: TLS handshake error from 10.244.0.1:47402: EOF level=info timestamp=2018-07-23T18:43:49.858055Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:43:49.861343Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/23 18:43:59 http: TLS handshake error from 10.244.0.1:47426: EOF 2018/07/23 18:44:09 http: TLS handshake error from 10.244.0.1:47450: EOF level=info timestamp=2018-07-23T18:44:12.014054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:44:16.882614Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:44:17.110947Z pos=filter.go:46 
component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:44:19 http: TLS handshake error from 10.244.0.1:47474: EOF Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-bgmjz Pod phase: Running level=info timestamp=2018-07-23T18:33:30.410083Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7vjqw kind= uid=d3cb54ab-8ea6-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:35:03.426836Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:35:03.430779Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw66d5 kind= uid=0b3ad4d8-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing 
VirtualMachineInstance" level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9f8bq" level=info timestamp=2018-07-23T18:39:41.551222Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:39:41.552271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:39:41.727769Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8wsx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8wsx6" level=info timestamp=2018-07-23T18:41:14.199968Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:41:14.200912Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as 
initialized" level=info timestamp=2018-07-23T18:42:46.961397Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:42:46.963823Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-j242z Pod phase: Running level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n" level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-xkbfk Pod phase: Running level=info timestamp=2018-07-23T18:43:05.033936Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind=Domain uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-23T18:43:05.056386Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-23T18:43:05.058571Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:43:05.058655Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiqs7v7, existing: true\n" level=info timestamp=2018-07-23T18:43:05.058677Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-07-23T18:43:05.058708Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:43:05.058737Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:43:05.058794Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T18:43:05.140221Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:43:05.149569Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiqs7v7, existing: true\n" level=info timestamp=2018-07-23T18:43:05.149669Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T18:43:05.149708Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:43:05.149759Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:43:05.149962Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-23T18:43:05.155742Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqs7v7-h5hzv Pod phase: Running level=info timestamp=2018-07-23T18:43:03.525742Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-23T18:43:04.408811Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:43:04.418159Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 699e0c6a-4870-4bd3-92a9-a92724bd1f6f" level=info timestamp=2018-07-23T18:43:04.418582Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:43:04.422713Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.013211Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:43:05.029240Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:43:05.034372Z pos=client.go:145 
component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:43:05.040224Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-23T18:43:05.050819Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Domain started."
level=info timestamp=2018-07-23T18:43:05.053934Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:43:05.055080Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-23T18:43:05.081610Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-23T18:43:05.155152Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-23T18:43:05.423825Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 699e0c6a-4870-4bd3-92a9-a92724bd1f6f: 189"

• Failure [123.799 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With VirtualMachineInstance with two PVCs
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:266
      should start vmi multiple times [It]
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:278

      Timed out after 123.048s.
      Expected error:
          <*errors.StatusError | 0xc42016a7e0>: {
              ErrStatus: {
                  TypeMeta: {Kind: "", APIVersion: ""},
                  ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                  Status: "Failure",
                  Message: "Timeout: request did not complete within allowed duration",
                  Reason: "Timeout",
                  Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                  Code: 504,
              },
          }
          Timeout: request did not complete within allowed duration
      not to have occurred
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64
------------------------------
STEP: Starting and stopping the VirtualMachineInstance number of times
STEP: Starting a VirtualMachineInstance
Pod name: disks-images-provider-jcfpz Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp Pod phase: Running
2018/07/23 18:42:28 http: TLS handshake error from 10.244.1.1:34556: EOF
2018/07/23 18:42:38 http: TLS handshake error from 10.244.1.1:34562: EOF
2018/07/23 18:42:48 http: TLS handshake error from 10.244.1.1:34568: EOF
2018/07/23 18:42:58 http: TLS handshake error from 10.244.1.1:34574: EOF
2018/07/23 18:43:08 http: TLS handshake error from 10.244.1.1:34580: EOF
2018/07/23 18:43:18 http: TLS handshake error from 10.244.1.1:34586: EOF
2018/07/23 18:43:28 http: TLS handshake error from 10.244.1.1:34592: EOF
2018/07/23 18:43:38 http: TLS handshake error from 10.244.1.1:34598: EOF
2018/07/23 18:43:48 http: TLS handshake error from 10.244.1.1:34604: EOF
2018/07/23 18:43:58 http: TLS handshake error from 10.244.1.1:34610: EOF
2018/07/23 18:44:08 http: TLS handshake error from 10.244.1.1:34616: EOF
2018/07/23 18:44:18 http: TLS handshake error from 10.244.1.1:34622: EOF
2018/07/23 18:44:28 http: TLS handshake error from 10.244.1.1:34628: EOF
2018/07/23 18:44:38 http: TLS handshake error from 10.244.1.1:34634: EOF
2018/07/23 18:44:48 http: TLS handshake error from 10.244.1.1:34640:
EOF Pod name: virt-api-7d79764579-wccxk Pod phase: Running level=info timestamp=2018-07-23T18:43:49.861343Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/23 18:43:59 http: TLS handshake error from 10.244.0.1:47426: EOF 2018/07/23 18:44:09 http: TLS handshake error from 10.244.0.1:47450: EOF level=info timestamp=2018-07-23T18:44:12.014054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:44:16.882614Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:44:17.110947Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:44:19 http: TLS handshake error from 10.244.0.1:47474: EOF 2018/07/23 18:44:29 http: TLS handshake error from 10.244.0.1:47498: EOF 2018/07/23 18:44:39 http: TLS handshake error from 10.244.0.1:47522: EOF level=info timestamp=2018-07-23T18:44:41.970565Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:44:47.047236Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:44:47.179205Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:44:49 http: TLS handshake error from 10.244.0.1:47546: EOF level=info 
timestamp=2018-07-23T18:44:49.867899Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:44:49.870375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-bgmjz Pod phase: Running level=info timestamp=2018-07-23T18:36:36.088311Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:36:36.090471Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi846lq kind= uid=4271fc6f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance 
kubevirt-test-default/testvmi9f8bq" level=info timestamp=2018-07-23T18:39:41.551222Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:39:41.552271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:39:41.727769Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8wsx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8wsx6" level=info timestamp=2018-07-23T18:41:14.199968Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:41:14.200912Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:42:46.961397Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:42:46.963823Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:44:50.471876Z pos=preset.go:139 component=virt-controller 
service=http namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:44:50.474840Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:44:50.840604Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv6cks\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6cks" Pod name: virt-handler-j242z Pod phase: Running level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n" level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-xkbfk Pod phase: Running level=info timestamp=2018-07-23T18:43:05.033936Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind=Domain uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-23T18:43:05.056386Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-23T18:43:05.058571Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:43:05.058655Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiqs7v7, existing: true\n" level=info timestamp=2018-07-23T18:43:05.058677Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-07-23T18:43:05.058708Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:43:05.058737Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:43:05.058794Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T18:43:05.140221Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:43:05.149569Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiqs7v7, existing: true\n" level=info timestamp=2018-07-23T18:43:05.149669Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T18:43:05.149708Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:43:05.149759Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:43:05.149962Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-23T18:43:05.155742Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqs7v7-h5hzv Pod phase: Running level=info timestamp=2018-07-23T18:43:03.525742Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-23T18:43:04.408811Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:43:04.418159Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 699e0c6a-4870-4bd3-92a9-a92724bd1f6f" level=info timestamp=2018-07-23T18:43:04.418582Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:43:04.422713Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.013211Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:43:05.029240Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:43:05.034372Z pos=client.go:145 
component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.040224Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-23T18:43:05.050819Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Domain started." level=info timestamp=2018-07-23T18:43:05.053934Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:43:05.055080Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:43:05.081610Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.155152Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:43:05.423825Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 699e0c6a-4870-4bd3-92a9-a92724bd1f6f: 189" Pod name: virt-launcher-testvmiv6cks-v89pn Pod phase: Pending • Failure in Spec Setup (BeforeEach) [30.597 seconds] Expose /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53 Expose service on a VM /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61 Expose ClusterIP service [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:68 Should expose a Cluster IP service on a VM and connect to it /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:71 Expected error: <*errors.StatusError | 0xc420633170>: { ErrStatus: { TypeMeta: {Kind: "", APIVersion: ""}, ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""}, Status: "Failure", Message: "Timeout: request did not complete within allowed duration", Reason: "Timeout", Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, 
RetryAfterSeconds: 0}, Code: 504, }, } Timeout: request did not complete within allowed duration not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:27 ------------------------------ Service node-port-vm successfully exposed for virtualmachineinstance testvmirmslp Pod name: disks-images-provider-jcfpz Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jr7kq Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-h8lwp Pod phase: Running 2018/07/23 18:44:08 http: TLS handshake error from 10.244.1.1:34616: EOF 2018/07/23 18:44:18 http: TLS handshake error from 10.244.1.1:34622: EOF 2018/07/23 18:44:28 http: TLS handshake error from 10.244.1.1:34628: EOF 2018/07/23 18:44:38 http: TLS handshake error from 10.244.1.1:34634: EOF 2018/07/23 18:44:48 http: TLS handshake error from 10.244.1.1:34640: EOF 2018/07/23 18:44:58 http: TLS handshake error from 10.244.1.1:34646: EOF 2018/07/23 18:45:08 http: TLS handshake error from 10.244.1.1:34652: EOF 2018/07/23 18:45:18 http: TLS handshake error from 10.244.1.1:34658: EOF 2018/07/23 18:45:28 http: TLS handshake error from 10.244.1.1:34664: EOF 2018/07/23 18:45:38 http: TLS handshake error from 10.244.1.1:34670: EOF 2018/07/23 18:45:48 http: TLS handshake error from 10.244.1.1:34676: EOF 2018/07/23 18:45:58 http: TLS handshake error from 10.244.1.1:34682: EOF 2018/07/23 18:46:08 http: TLS handshake error from 10.244.1.1:34688: EOF 2018/07/23 18:46:18 http: TLS handshake error from 10.244.1.1:34694: EOF 2018/07/23 18:46:28 http: TLS handshake error from 10.244.1.1:34700: EOF Pod name: virt-api-7d79764579-wccxk Pod phase: Running level=info timestamp=2018-07-23T18:45:17.419256Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:45:19 http: TLS handshake error from 10.244.0.1:47628: EOF 2018/07/23 
18:45:29 http: TLS handshake error from 10.244.0.1:47652: EOF 2018/07/23 18:45:39 http: TLS handshake error from 10.244.0.1:47678: EOF level=info timestamp=2018-07-23T18:45:41.938424Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:45:47.420062Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:45:47.467055Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:45:49 http: TLS handshake error from 10.244.0.1:47702: EOF 2018/07/23 18:45:59 http: TLS handshake error from 10.244.0.1:47726: EOF 2018/07/23 18:46:09 http: TLS handshake error from 10.244.0.1:47750: EOF level=info timestamp=2018-07-23T18:46:11.916320Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-23T18:46:17.520233Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-23T18:46:17.565112Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/23 18:46:19 http: TLS handshake error from 10.244.0.1:47774: EOF 2018/07/23 18:46:29 http: TLS handshake error from 10.244.0.1:47798: EOF Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running level=info timestamp=2018-07-23T18:29:14.474877Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: 
virt-controller-7d57d96b65-bgmjz Pod phase: Running level=info timestamp=2018-07-23T18:38:08.727754Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:38:08.728451Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9f8bq kind= uid=79acc48a-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:38:08.908320Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9f8bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9f8bq" level=info timestamp=2018-07-23T18:39:41.551222Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:39:41.552271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8wsx6 kind= uid=b102bf82-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:39:41.727769Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8wsx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8wsx6" level=info timestamp=2018-07-23T18:41:14.199968Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 
msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:41:14.200912Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi64rz8 kind= uid=e83a1b8f-8ea7-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:42:46.961397Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:42:46.963823Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:44:50.471876Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:44:50.474840Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-23T18:44:50.840604Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv6cks\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6cks" level=info timestamp=2018-07-23T18:44:51.095045Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-23T18:44:51.096041Z pos=preset.go:165 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-j242z Pod phase: Running level=info timestamp=2018-07-23T18:28:15.789441Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T18:28:15.795111Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.795207Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.795290Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.795548Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.818810Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.819174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: true\n" level=info timestamp=2018-07-23T18:28:15.819246Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T18:28:15.819344Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.819483Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-23T18:28:15.820577Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind= uid=1c39a484-8ea6-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:28:15.825049Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6qw6l, existing: false\n" level=info timestamp=2018-07-23T18:28:15.825168Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T18:28:15.825406Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T18:28:15.825587Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6qw6l kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-xkbfk Pod phase: Running level=info timestamp=2018-07-23T18:45:07.784363Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:45:07.796756Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-23T18:45:07.798780Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T18:45:07.798839Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiv6cks, existing: true\n" level=info timestamp=2018-07-23T18:45:07.798859Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-07-23T18:45:07.798883Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:45:07.798900Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:45:07.798942Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T18:45:07.831824Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T18:45:07.831909Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmiv6cks, existing: true\n" level=info timestamp=2018-07-23T18:45:07.831929Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T18:45:07.831961Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-23T18:45:07.831980Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-23T18:45:07.832087Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-23T18:45:07.840324Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
Pod name: netcat9zshm Pod phase: Failed ++ head -n 1 +++ nc 192.168.66.101 30017 -i 1 -w 1 failed Ncat: Connection refused. + x= + echo '' + '[' '' = 'Hello World!' ']' + echo failed + exit 1 Pod name: virt-launcher-testvmiqs7v7-h5hzv Pod phase: Running level=info timestamp=2018-07-23T18:43:03.525742Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-23T18:43:04.408811Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:43:04.418159Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 699e0c6a-4870-4bd3-92a9-a92724bd1f6f" level=info timestamp=2018-07-23T18:43:04.418582Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:43:04.422713Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.013211Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:43:05.029240Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:43:05.034372Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.040224Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-23T18:43:05.050819Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-23T18:43:05.053934Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:43:05.055080Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:43:05.081610Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:43:05.155152Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqs7v7 kind= uid=1f84b5da-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:43:05.423825Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 699e0c6a-4870-4bd3-92a9-a92724bd1f6f: 189" Pod name: virt-launcher-testvmirmslp-24c8f Pod phase: Running level=info timestamp=2018-07-23T18:45:06.531378Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.049987Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:45:07.057488Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.272295Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.323270Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:45:07.325555Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 7b1b5c17-8ebf-470d-b9e2-5fd8c7c4b69d" level=info timestamp=2018-07-23T18:45:07.325780Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:45:07.330632Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.330979Z pos=client.go:136 component=virt-launcher msg="Libvirt 
event 2 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.361489Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Domain started." level=info timestamp=2018-07-23T18:45:07.368253Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:45:07.383994Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:45:07.410692Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.780571Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirmslp kind= uid=7b5e125f-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:45:08.331681Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7b1b5c17-8ebf-470d-b9e2-5fd8c7c4b69d: 185" Pod name: virt-launcher-testvmiv6cks-v89pn Pod phase: Running level=info timestamp=2018-07-23T18:45:06.569518Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.040125Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-23T18:45:07.040905Z pos=virt-launcher.go:215 component=virt-launcher msg="Detected domain with UUID 2f997b8c-a0bd-406b-9dae-38fa1a021932" level=info timestamp=2018-07-23T18:45:07.041216Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-23T18:45:07.053445Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.309771Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.340771Z pos=client.go:119 component=virt-launcher 
msg="domain status: 1:1" level=info timestamp=2018-07-23T18:45:07.343688Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.370211Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-23T18:45:07.785645Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Domain started." level=info timestamp=2018-07-23T18:45:07.788630Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:45:07.789495Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-23T18:45:07.799969Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-23T18:45:07.835280Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiv6cks kind= uid=69218b73-8ea8-11e8-9512-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-23T18:45:08.046470Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 2f997b8c-a0bd-406b-9dae-38fa1a021932: 183" • Failure [104.609 seconds] Expose /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53 Expose service on a VM /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61 Expose NodePort service /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:98 Should expose a NodePort service on a VM and connect to it [It] /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:103 Timed out after 60.000s. 
      Expected
          : Failed
      to equal
          : Succeeded

      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:134
------------------------------
level=info timestamp=2018-07-23T18:44:52.482018Z pos=utils.go:254 component=tests msg="Created virtual machine pod virt-launcher-testvmirmslp-24c8f"
level=info timestamp=2018-07-23T18:45:07.225252Z pos=utils.go:254 component=tests msg="Pod owner ship transferred to the node virt-launcher-testvmirmslp-24c8f"
level=info timestamp=2018-07-23T18:45:08.914982Z pos=utils.go:254 component=tests msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-23T18:45:08.949226Z pos=utils.go:254 component=tests msg="VirtualMachineInstance started."
STEP: Exposing the service via virtctl command
STEP: Getting back the the service
STEP: Getting the node IP from all nodes
STEP: Starting a pod which tries to reach the VM via NodePort
STEP: Waiting for the pod to report a successful connection attempt
Service cluster-ip-udp-vm successfully exposed for virtualmachineinstance testvminfz9f
• [SLOW TEST:54.256 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:140
    Expose ClusterIP UDP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:147
      Should expose a ClusterIP service on a VM and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:151
------------------------------
Service node-port-udp-vm successfully exposed for virtualmachineinstance testvminfz9f
• [SLOW TEST:11.524 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:140
    Expose NodePort UDP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:179
      Should expose a NodePort service on a VM and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:184
------------------------------
Service cluster-ip-vmrs successfully exposed for vmirs replicasetxldr2
• [SLOW TEST:65.768 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM replica set
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:227
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:260
      Should create a ClusterIP service on VMRS and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:264
------------------------------
Service cluster-ip-ovm successfully exposed for virtualmachine testvmi8qzv2
VM testvmi8qzv2 was scheduled to start
• [SLOW TEST:45.754 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on an Offline VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:292
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:336
      Connect to ClusterIP services that was set when VM was offline
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:337
------------------------------
• [SLOW TEST:160.897 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting and stopping the same VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:90
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:91
      should success multiple times
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:92
------------------------------
• [SLOW TEST:17.234 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:111
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:112
      should not modify the spec on status update
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:113
------------------------------
• [SLOW TEST:27.459 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
  Starting multiple VMIs
  /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:129
    with ephemeral registry disk
    /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:130
      should success
      /root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:131
------------------------------
•••••••••••
------------------------------
• [SLOW TEST:45.364 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    with cloudInitNoCloud userDataBase64 source
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
      should have cloud-init data
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:82
------------------------------
• [SLOW TEST:99.073 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    with cloudInitNoCloud userDataBase64 source
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
      with injected ssh-key
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:92
        should have ssh-key under authorized keys
        /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:93
------------------------------
• [SLOW TEST:52.104 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    with cloudInitNoCloud userData source
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:118
      should process provided cloud-init data
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:119
------------------------------
• [SLOW TEST:46.096 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    should take user-data from k8s secret
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:162
------------------------------
• [SLOW TEST:37.175 seconds]
LeaderElection
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43
  Start a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53
    when the controller pod is not running
    /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54
      should success
      /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.013 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  should succeed to start a vmi [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:133

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.017 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  should succeed to stop a running vmi [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:139

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.020 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with winrm connection [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
    should have correct UUID
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:192

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.009 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with winrm connection [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
    should have pod IP
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:208

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.010 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with kubectl command [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
    should succeed to start a vmi
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:242

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.011 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with kubectl command [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
    should succeed to stop a vmi
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:250

  Skip Windows tests that requires PVC disk-windows
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1358
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      should succeed to generate a VM JSON file using oc-process command
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:150

  Skip test that requires oc binary
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1399
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        should succeed to create a VM using oc-create command
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:156

  Skip test that requires oc binary
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1399
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        with given VM from the VM JSON
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
          should succeed to launch a VMI using oc-patch command
          /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:161

  Skip test that requires oc binary
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1399
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        with given VM from the VM JSON
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
          with given VMI from the VM
          /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:163
            should succeed to terminate the VMI using oc-patch command
            /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:166

  Skip test that requires oc binary
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1399
------------------------------
• [SLOW TEST:5.518 seconds]
Subresource Api
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:37
  Rbac Authorization
  /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:48
    with correct permissions
    /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:51
      should be allowed to access subresource endpoint
      /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:52
------------------------------
• [SLOW TEST:5.026 seconds]
Subresource Api
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:37
  Rbac Authorization
  /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:48
    Without permissions
    /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:56
      should not be able to access subresource endpoint
      /root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:57
------------------------------
•••volumedisk0 compute
------------------------------
• [SLOW TEST:35.838 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with 3 CPU cores
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:56
      should report 3 cpu cores under guest OS
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:62
------------------------------
• [SLOW TEST:18.854 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:108
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-2Mi
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
S [SKIPPING] [0.235 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:108
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-1Gi [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

  No node with hugepages hugepages-1Gi capacity
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:160
------------------------------
•
------------------------------
• [SLOW TEST:100.386 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:238
    when CPU model defined
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:284
      should report defined CPU model
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:285
------------------------------
• [SLOW TEST:104.836 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:238
    when CPU model equals to passthrough
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:312
      should report exactly the same model as node CPU
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:313
------------------------------
• [SLOW TEST:104.776 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:238
    when CPU model not defined
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:336
      should report CPU model from libvirt capabilities
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:337
------------------------------
• [SLOW TEST:44.049 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  New VirtualMachineInstance with all supported drives
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:357
    should have all the device nodes
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:380
------------------------------
• [SLOW TEST:19.658 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should successfully start with hook sidecar annotation
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60
------------------------------
• [SLOW TEST:18.973 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should call Collect and OnDefineDomain on the hook sidecar
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67
------------------------------
• [SLOW TEST:21.291 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should update domain XML with SM BIOS properties
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:83
------------------------------
• [SLOW TEST:15.357 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
  With default kubevirt service accounts
  /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
    should verify permissions are correct for view, edit, and admin
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
      given a vmi
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.823 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
  With default kubevirt service accounts
  /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
    should verify permissions are correct for view, edit, and admin
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
      given an vm
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.390 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
  With default kubevirt service accounts
  /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
    should verify permissions are correct for view, edit, and admin
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
      given a vmi preset
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.590 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
  With default kubevirt service accounts
  /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
    should verify permissions are correct for view, edit, and admin
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
      given a vmi replica set
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••
------------------------------
• [SLOW TEST:18.360 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should update VirtualMachine once VMIs are up
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195
------------------------------
• [SLOW TEST:12.371 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should remove VirtualMachineInstance once the VMI is marked for deletion
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:204
------------------------------
•
------------------------------
• [SLOW TEST:106.491 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if it gets deleted
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245
------------------------------
• [SLOW TEST:89.511 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265
------------------------------
• [SLOW TEST:30.762 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should stop VirtualMachineInstance if running set to false
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325
------------------------------
• [SLOW TEST:370.038 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should start and stop VirtualMachineInstance multiple times
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333
------------------------------
• [SLOW TEST:79.556 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should not update the VirtualMachineInstance spec if Running
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346
------------------------------
• [SLOW TEST:280.100 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should survive guest shutdown, multiple times
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387
------------------------------
VM testvmitrj66 was scheduled to start
• [SLOW TEST:18.875 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    Using virtctl interface
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
      should start a VirtualMachineInstance once
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436
------------------------------
VM testvmivb25w was scheduled to stop
• [SLOW TEST:31.140 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    Using virtctl interface
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
      should stop a VirtualMachineInstance once
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467
------------------------------
• [SLOW TEST:17.752 seconds]
VNC
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:54
    with VNC connection
    /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:62
      should allow accessing the VNC device
      /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:64
------------------------------
••
------------------------------
• [SLOW TEST:88.933 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••••
------------------------------
• [SLOW TEST:5.193 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on a different node from Pod
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••••
------------------------------
• [SLOW TEST:5.234 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a subdomain and a headless service given
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:315
    should be able to reach the vmi via its unique fully qualified domain name
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:338
------------------------------
• [SLOW TEST:33.583 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom interface model
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:364
    should expose the right device type to the guest
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:365
------------------------------
•
------------------------------
• [SLOW TEST:31.236 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:398
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:399
------------------------------
• [SLOW TEST:35.368 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address in non-conventional format
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:410
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:411
------------------------------
• [SLOW TEST:34.200 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address and slirp interface
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:423
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:424
------------------------------
• [SLOW TEST:44.036 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with disabled automatic attachment of interfaces
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:436
    should not configure any external interfaces
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:437
------------------------------
• [SLOW TEST:97.483 seconds]
Slirp
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
  should be able to
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    VirtualMachineInstance with slirp interface
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
•
Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:32:27 http: TLS handshake error from 10.244.1.1:49890: EOF
2018/07/23 19:32:37 http: TLS handshake error from 10.244.1.1:49894: EOF
2018/07/23 19:32:47 http: TLS handshake error from 10.244.1.1:49900: EOF
2018/07/23 19:32:57 http: TLS handshake error from 10.244.1.1:49906: EOF
2018/07/23 19:33:07 http: TLS handshake error from 10.244.1.1:49912: EOF
2018/07/23 19:33:17 http: TLS handshake error from 10.244.1.1:49918: EOF
2018/07/23 19:33:27 http: TLS handshake error from 10.244.1.1:49924: EOF
2018/07/23 19:33:37 http: TLS handshake error from 10.244.1.1:49930: EOF
2018/07/23 19:33:47 http: TLS handshake error from 10.244.1.1:49936: EOF
2018/07/23 19:33:57 http: TLS handshake error from 10.244.1.1:49942: EOF
2018/07/23 19:34:07 http: TLS handshake error from 10.244.1.1:49948: EOF
2018/07/23 19:34:17 http: TLS handshake error from 10.244.1.1:49954: EOF
2018/07/23 19:34:27 http: TLS handshake error from 10.244.1.1:49960: EOF
2018/07/23 19:34:37 http: TLS handshake error from 10.244.1.1:49966: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T19:33:37.886493Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:33:39 http: TLS handshake error from 10.244.0.1:55070: EOF
level=info timestamp=2018-07-23T19:33:42.218471Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:33:49 http: TLS handshake error from 10.244.0.1:55094: EOF
level=info timestamp=2018-07-23T19:33:50.118047Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:33:50.121091Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:33:59 http: TLS handshake error from 10.244.0.1:55118: EOF
level=info timestamp=2018-07-23T19:34:08.018273Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:34:08.043046Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:34:09 http: TLS handshake error from 10.244.0.1:55142: EOF
level=info timestamp=2018-07-23T19:34:12.087478Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:19 http: TLS handshake error from 10.244.0.1:55166: EOF
2018/07/23 19:34:29 http: TLS handshake error from 10.244.0.1:55190: EOF
level=info timestamp=2018-07-23T19:34:38.255232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:34:38.318532Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n" level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-xkbfk Pod phase: Running level=info timestamp=2018-07-23T19:32:17.734628Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-07-23T19:32:17.735931Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-07-23T19:32:17.835179Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-23T19:32:17.850573Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-23T19:32:17.858483Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-07-23T19:32:17.935916Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n" level=info timestamp=2018-07-23T19:32:17.936094Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-23T19:32:17.936295Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T19:32:17.936473Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n" level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmi9gm2g-m85m2 Pod phase: Pending ------------------------------ • Failure [202.116 seconds] Health Monitoring /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37 A VirtualMachineInstance with a watchdog device /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56 should be shut down when the watchdog expires [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57 Expected error: : 180000000000 expect: timer expired after 180 seconds not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:64 ------------------------------ STEP: Starting a VirtualMachineInstance level=info timestamp=2018-07-23T19:31:21.027934Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmi9gm2g-m85m2" level=info timestamp=2018-07-23T19:31:36.371268Z pos=utils.go:243 component=tests msg="Pod owner ship transferred to the node virt-launcher-testvmi9gm2g-m85m2" level=info timestamp=2018-07-23T19:31:38.880893Z pos=utils.go:243 component=tests msg="VirtualMachineInstance defined." 
level=info timestamp=2018-07-23T19:31:38.902352Z pos=utils.go:243 component=tests msg="VirtualMachineInstance started."
STEP: Expecting the VirtualMachineInstance console
level=info timestamp=2018-07-23T19:34:39.050998Z pos=utils.go:1265 component=tests namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Login: [{2 \r\n\r\n\r\nISOLINUX 6.04 6.04-pre1 Copyright (C) 1994-2015 H. Peter Anvin et al\r\nboot: \u001b[?7h\r\n []}]"

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:32:47 http: TLS handshake error from 10.244.1.1:49900: EOF
2018/07/23 19:32:57 http: TLS handshake error from 10.244.1.1:49906: EOF
2018/07/23 19:33:07 http: TLS handshake error from 10.244.1.1:49912: EOF
2018/07/23 19:33:17 http: TLS handshake error from 10.244.1.1:49918: EOF
2018/07/23 19:33:27 http: TLS handshake error from 10.244.1.1:49924: EOF
2018/07/23 19:33:37 http: TLS handshake error from 10.244.1.1:49930: EOF
2018/07/23 19:33:47 http: TLS handshake error from 10.244.1.1:49936: EOF
2018/07/23 19:33:57 http: TLS handshake error from 10.244.1.1:49942: EOF
2018/07/23 19:34:07 http: TLS handshake error from 10.244.1.1:49948: EOF
2018/07/23 19:34:17 http: TLS handshake error from 10.244.1.1:49954: EOF
2018/07/23 19:34:27 http: TLS handshake error from 10.244.1.1:49960: EOF
2018/07/23 19:34:37 http: TLS handshake error from 10.244.1.1:49966: EOF
2018/07/23 19:34:47 http: TLS handshake error from 10.244.1.1:49972: EOF
2018/07/23 19:34:57 http: TLS handshake error from 10.244.1.1:49978: EOF
2018/07/23 19:35:07 http: TLS handshake error from 10.244.1.1:49984: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T19:34:08.043046Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:34:09 http: TLS handshake error from 10.244.0.1:55142: EOF
level=info timestamp=2018-07-23T19:34:12.087478Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:19 http: TLS handshake error from 10.244.0.1:55166: EOF
2018/07/23 19:34:29 http: TLS handshake error from 10.244.0.1:55190: EOF
level=info timestamp=2018-07-23T19:34:38.255232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:34:38.318532Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:34:39 http: TLS handshake error from 10.244.0.1:55216: EOF
level=info timestamp=2018-07-23T19:34:42.092495Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:49 http: TLS handshake error from 10.244.0.1:55240: EOF
level=info timestamp=2018-07-23T19:34:50.130905Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:34:50.147053Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:59 http: TLS handshake error from 10.244.0.1:55264: EOF
level=info timestamp=2018-07-23T19:35:08.409603Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:08.475052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.572 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should scale
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    to three, to two and then to zero replicas [It]
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Expected error:
        <*errors.StatusError | 0xc4205ca2d0>: {
            ErrStatus: {
                TypeMeta: {Kind: "", APIVersion: ""},
                ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                Status: "Failure",
                Message: "Timeout: request did not complete within allowed duration",
                Reason: "Timeout",
                Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                Code: 504,
            },
        }
        Timeout: request did not complete within allowed duration
    not to have occurred

    /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:33:17 http: TLS handshake error from 10.244.1.1:49918: EOF
2018/07/23 19:33:27 http: TLS handshake error from 10.244.1.1:49924: EOF
2018/07/23 19:33:37 http: TLS handshake error from 10.244.1.1:49930: EOF
2018/07/23 19:33:47 http: TLS handshake error from 10.244.1.1:49936: EOF
2018/07/23 19:33:57 http: TLS handshake error from 10.244.1.1:49942: EOF
2018/07/23 19:34:07 http: TLS handshake error from 10.244.1.1:49948: EOF
2018/07/23 19:34:17 http: TLS handshake error from 10.244.1.1:49954: EOF
2018/07/23 19:34:27 http: TLS handshake error from 10.244.1.1:49960: EOF
2018/07/23 19:34:37 http: TLS handshake error from 10.244.1.1:49966: EOF
2018/07/23 19:34:47 http: TLS handshake error from 10.244.1.1:49972: EOF
2018/07/23 19:34:57 http: TLS handshake error from 10.244.1.1:49978: EOF
2018/07/23 19:35:07 http: TLS handshake error from 10.244.1.1:49984: EOF
2018/07/23 19:35:17 http: TLS handshake error from 10.244.1.1:49990: EOF
2018/07/23 19:35:27 http: TLS handshake error from 10.244.1.1:49996: EOF
2018/07/23 19:35:37 http: TLS handshake error from 10.244.1.1:50002: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
2018/07/23 19:34:39 http: TLS handshake error from 10.244.0.1:55216: EOF
level=info timestamp=2018-07-23T19:34:42.092495Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:49 http: TLS handshake error from 10.244.0.1:55240: EOF
level=info timestamp=2018-07-23T19:34:50.130905Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:34:50.147053Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:34:59 http: TLS handshake error from 10.244.0.1:55264: EOF
level=info timestamp=2018-07-23T19:35:08.409603Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:08.475052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:35:09 http: TLS handshake error from 10.244.0.1:55288: EOF
level=info timestamp=2018-07-23T19:35:12.078730Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:35:19 http: TLS handshake error from 10.244.0.1:55312: EOF
2018/07/23 19:35:29 http: TLS handshake error from 10.244.0.1:55336: EOF
level=info timestamp=2018-07-23T19:35:38.580226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:38.617322Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:35:39 http: TLS handshake error from 10.244.0.1:55360: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.597 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should scale
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    to five, to six and then to zero replicas [It]
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Expected error:
        <*errors.StatusError | 0xc4206330e0>: {
            ErrStatus: {
                TypeMeta: {Kind: "", APIVersion: ""},
                ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
                Status: "Failure",
                Message: "Timeout: request did not complete within allowed duration",
                Reason: "Timeout",
                Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
                Code: 504,
            },
        }
        Timeout: request did not complete within allowed duration
    not to have occurred

    /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:33:47 http: TLS handshake error from 10.244.1.1:49936: EOF
2018/07/23 19:33:57 http: TLS handshake error from 10.244.1.1:49942: EOF
2018/07/23 19:34:07 http: TLS handshake error from 10.244.1.1:49948: EOF
2018/07/23 19:34:17 http: TLS handshake error from 10.244.1.1:49954: EOF
2018/07/23 19:34:27 http: TLS handshake error from 10.244.1.1:49960: EOF
2018/07/23 19:34:37 http: TLS handshake error from 10.244.1.1:49966: EOF
2018/07/23 19:34:47 http: TLS handshake error from 10.244.1.1:49972: EOF
2018/07/23 19:34:57 http: TLS handshake error from 10.244.1.1:49978: EOF
2018/07/23 19:35:07 http: TLS handshake error from 10.244.1.1:49984: EOF
2018/07/23 19:35:17 http: TLS handshake error from 10.244.1.1:49990: EOF
2018/07/23 19:35:27 http: TLS handshake error from 10.244.1.1:49996: EOF
2018/07/23 19:35:37 http: TLS handshake error from 10.244.1.1:50002: EOF
2018/07/23 19:35:47 http: TLS handshake error from 10.244.1.1:50008: EOF
2018/07/23 19:35:57 http: TLS handshake error from 10.244.1.1:50014: EOF
2018/07/23 19:36:07 http: TLS handshake error from 10.244.1.1:50020: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T19:35:08.409603Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:08.475052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:35:09 http: TLS handshake error from 10.244.0.1:55288: EOF
level=info timestamp=2018-07-23T19:35:12.078730Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:35:19 http: TLS handshake error from 10.244.0.1:55312: EOF
2018/07/23 19:35:29 http: TLS handshake error from 10.244.0.1:55336: EOF
level=info timestamp=2018-07-23T19:35:38.580226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:38.617322Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:35:39 http: TLS handshake error from 10.244.0.1:55360: EOF
level=info timestamp=2018-07-23T19:35:42.090459Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:35:49 http: TLS handshake error from 10.244.0.1:55384: EOF
2018/07/23 19:35:59 http: TLS handshake error from 10.244.0.1:55408: EOF
level=info timestamp=2018-07-23T19:36:08.766189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:36:08.774069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:36:09 http: TLS handshake error from 10.244.0.1:55432: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.532 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should be rejected on POST if spec is invalid [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:107

  Expected error:
      <*errors.StatusError | 0xc4201366c0>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:34:17 http: TLS handshake error from 10.244.1.1:49954: EOF
2018/07/23 19:34:27 http: TLS handshake error from 10.244.1.1:49960: EOF
2018/07/23 19:34:37 http: TLS handshake error from 10.244.1.1:49966: EOF
2018/07/23 19:34:47 http: TLS handshake error from 10.244.1.1:49972: EOF
2018/07/23 19:34:57 http: TLS handshake error from 10.244.1.1:49978: EOF
2018/07/23 19:35:07 http: TLS handshake error from 10.244.1.1:49984: EOF
2018/07/23 19:35:17 http: TLS handshake error from 10.244.1.1:49990: EOF
2018/07/23 19:35:27 http: TLS handshake error from 10.244.1.1:49996: EOF
2018/07/23 19:35:37 http: TLS handshake error from 10.244.1.1:50002: EOF
2018/07/23 19:35:47 http: TLS handshake error from 10.244.1.1:50008: EOF
2018/07/23 19:35:57 http: TLS handshake error from 10.244.1.1:50014: EOF
2018/07/23 19:36:07 http: TLS handshake error from 10.244.1.1:50020: EOF
2018/07/23 19:36:17 http: TLS handshake error from 10.244.1.1:50026: EOF
2018/07/23 19:36:27 http: TLS handshake error from 10.244.1.1:50032: EOF
2018/07/23 19:36:37 http: TLS handshake error from 10.244.1.1:50038: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
level=info timestamp=2018-07-23T19:35:38.580226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:35:38.617322Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:35:39 http: TLS handshake error from 10.244.0.1:55360: EOF
level=info timestamp=2018-07-23T19:35:42.090459Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:35:49 http: TLS handshake error from 10.244.0.1:55384: EOF
2018/07/23 19:35:59 http: TLS handshake error from 10.244.0.1:55408: EOF
level=info timestamp=2018-07-23T19:36:08.766189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:36:08.774069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:36:09 http: TLS handshake error from 10.244.0.1:55432: EOF
level=info timestamp=2018-07-23T19:36:12.050082Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:19 http: TLS handshake error from 10.244.0.1:55456: EOF
2018/07/23 19:36:29 http: TLS handshake error from 10.244.0.1:55480: EOF
level=info timestamp=2018-07-23T19:36:38.921130Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:36:38.925417Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:36:39 http: TLS handshake error from 10.244.0.1:55504: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.518 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should reject POST if validation webhoook deems the spec is invalid [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:128

  Expected error:
      <*errors.StatusError | 0xc420137dd0>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:34:47 http: TLS handshake error from 10.244.1.1:49972: EOF
2018/07/23 19:34:57 http: TLS handshake error from 10.244.1.1:49978: EOF
2018/07/23 19:35:07 http: TLS handshake error from 10.244.1.1:49984: EOF
2018/07/23 19:35:17 http: TLS handshake error from 10.244.1.1:49990: EOF
2018/07/23 19:35:27 http: TLS handshake error from 10.244.1.1:49996: EOF
2018/07/23 19:35:37 http: TLS handshake error from 10.244.1.1:50002: EOF
2018/07/23 19:35:47 http: TLS handshake error from 10.244.1.1:50008: EOF
2018/07/23 19:35:57 http: TLS handshake error from 10.244.1.1:50014: EOF
2018/07/23 19:36:07 http: TLS handshake error from 10.244.1.1:50020: EOF
2018/07/23 19:36:17 http: TLS handshake error from 10.244.1.1:50026: EOF
2018/07/23 19:36:27 http: TLS handshake error from 10.244.1.1:50032: EOF
2018/07/23 19:36:37 http: TLS handshake error from 10.244.1.1:50038: EOF
2018/07/23 19:36:47 http: TLS handshake error from 10.244.1.1:50044: EOF
2018/07/23 19:36:57 http: TLS handshake error from 10.244.1.1:50050: EOF
2018/07/23 19:37:07 http: TLS handshake error from 10.244.1.1:50056: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
2018/07/23 19:36:09 http: TLS handshake error from 10.244.0.1:55432: EOF
level=info timestamp=2018-07-23T19:36:12.050082Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:19 http: TLS handshake error from 10.244.0.1:55456: EOF
2018/07/23 19:36:29 http: TLS handshake error from 10.244.0.1:55480: EOF
level=info timestamp=2018-07-23T19:36:38.921130Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:36:38.925417Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:36:39 http: TLS handshake error from 10.244.0.1:55504: EOF
level=info timestamp=2018-07-23T19:36:42.025022Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:49 http: TLS handshake error from 10.244.0.1:55528: EOF
level=info timestamp=2018-07-23T19:36:50.151301Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:36:50.154087Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:59 http: TLS handshake error from 10.244.0.1:55552: EOF
level=info timestamp=2018-07-23T19:37:09.084321Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:37:09.094841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:37:09 http: TLS handshake error from 10.244.0.1:55576: EOF

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."

Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
• Failure [30.487 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should update readyReplicas once VMIs are up [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:157

  Expected error:
      <*errors.StatusError | 0xc42016bc20>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set
Pod name: disks-images-provider-jcfpz Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp Pod phase: Running
2018/07/23 19:35:17 http: TLS handshake error from 10.244.1.1:49990: EOF
2018/07/23 19:35:27 http: TLS handshake error from 10.244.1.1:49996: EOF
2018/07/23 19:35:37 http: TLS handshake error from 10.244.1.1:50002: EOF
2018/07/23 19:35:47 http: TLS handshake error from 10.244.1.1:50008: EOF
2018/07/23 19:35:57 http: TLS handshake error from 10.244.1.1:50014: EOF
2018/07/23 19:36:07 http: TLS handshake error from 10.244.1.1:50020: EOF
2018/07/23 19:36:17 http: TLS handshake error from 10.244.1.1:50026: EOF
2018/07/23 19:36:27 http: TLS handshake error from 10.244.1.1:50032: EOF
2018/07/23 19:36:37 http: TLS handshake error from 10.244.1.1:50038: EOF
2018/07/23 19:36:47 http: TLS handshake error from 10.244.1.1:50044: EOF
2018/07/23 19:36:57 http: TLS handshake error from 10.244.1.1:50050: EOF
2018/07/23 19:37:07 http: TLS handshake error from 10.244.1.1:50056: EOF
2018/07/23 19:37:17 http: TLS handshake error from 10.244.1.1:50062: EOF
2018/07/23 19:37:27 http: TLS handshake error from 10.244.1.1:50068: EOF
2018/07/23 19:37:37 http: TLS handshake error from 10.244.1.1:50074: EOF
Pod name: virt-api-7d79764579-wccxk Pod phase: Running
2018/07/23 19:36:39 http: TLS handshake error from 10.244.0.1:55504: EOF
level=info timestamp=2018-07-23T19:36:42.025022Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:49 http: TLS handshake error from 10.244.0.1:55528: EOF
level=info timestamp=2018-07-23T19:36:50.151301Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:36:50.154087Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:36:59 http: TLS handshake error from 10.244.0.1:55552: EOF
level=info timestamp=2018-07-23T19:37:09.084321Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:37:09.094841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:37:09 http: TLS handshake error from 10.244.0.1:55576: EOF
level=info timestamp=2018-07-23T19:37:12.113926Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:37:19 http: TLS handshake error from 10.244.0.1:55600: EOF
2018/07/23 19:37:29 http: TLS handshake error from 10.244.0.1:55624: EOF
level=info timestamp=2018-07-23T19:37:39.311000Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:37:39.333837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:37:39 http: TLS handshake error from 10.244.0.1:55648: EOF
Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-7xttj Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
Pod name: virt-handler-j242z Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
• Failure [30.432 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should remove VMIs once it is marked for deletion [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:169

  Expected error:
      <*errors.StatusError | 0xc4206330e0>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set
Pod name: disks-images-provider-jcfpz Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp Pod phase: Running
2018/07/23 19:35:47 http: TLS handshake error from 10.244.1.1:50008: EOF
2018/07/23 19:35:57 http: TLS handshake error from 10.244.1.1:50014: EOF
2018/07/23 19:36:07 http: TLS handshake error from 10.244.1.1:50020: EOF
2018/07/23 19:36:17 http: TLS handshake error from 10.244.1.1:50026: EOF
2018/07/23 19:36:27 http: TLS handshake error from 10.244.1.1:50032: EOF
2018/07/23 19:36:37 http: TLS handshake error from 10.244.1.1:50038: EOF
2018/07/23 19:36:47 http: TLS handshake error from 10.244.1.1:50044: EOF
2018/07/23 19:36:57 http: TLS handshake error from 10.244.1.1:50050: EOF
2018/07/23 19:37:07 http: TLS handshake error from 10.244.1.1:50056: EOF
2018/07/23 19:37:17 http: TLS handshake error from 10.244.1.1:50062: EOF
2018/07/23 19:37:27 http: TLS handshake error from 10.244.1.1:50068: EOF
2018/07/23 19:37:37 http: TLS handshake error from 10.244.1.1:50074: EOF
2018/07/23 19:37:47 http: TLS handshake error from 10.244.1.1:50080: EOF
2018/07/23 19:37:57 http: TLS handshake error from 10.244.1.1:50086: EOF
2018/07/23 19:38:07 http: TLS handshake error from 10.244.1.1:50092: EOF
Pod name: virt-api-7d79764579-wccxk Pod phase: Running
level=info timestamp=2018-07-23T19:37:09.094841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:37:09 http: TLS handshake error from 10.244.0.1:55576: EOF
level=info timestamp=2018-07-23T19:37:12.113926Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:37:19 http: TLS handshake error from 10.244.0.1:55600: EOF
2018/07/23 19:37:29 http: TLS handshake error from 10.244.0.1:55624: EOF
level=info timestamp=2018-07-23T19:37:39.311000Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:37:39.333837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/23 19:37:39 http: TLS handshake error from 10.244.0.1:55648: EOF
level=info timestamp=2018-07-23T19:37:42.065425Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:37:49 http: TLS handshake error from 10.244.0.1:55672: EOF
2018/07/23 19:37:59 http: TLS handshake error from 10.244.0.1:55696: EOF
2018/07/23 19:38:09 http: TLS handshake error from 10.244.0.1:55720: EOF
level=info timestamp=2018-07-23T19:38:09.477865Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:09.507416Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:12.193396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-7xttj Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
Pod name: virt-handler-j242z Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
• Failure [30.547 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should remove owner references on the VirtualMachineInstance if it is orphan deleted [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:185

  Expected error:
      <*errors.StatusError | 0xc4201367e0>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set
Pod name: disks-images-provider-jcfpz Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jr7kq Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-h8lwp Pod phase: Running
2018/07/23 19:36:17 http: TLS handshake error from 10.244.1.1:50026: EOF
2018/07/23 19:36:27 http: TLS handshake error from 10.244.1.1:50032: EOF
2018/07/23 19:36:37 http: TLS handshake error from 10.244.1.1:50038: EOF
2018/07/23 19:36:47 http: TLS handshake error from 10.244.1.1:50044: EOF
2018/07/23 19:36:57 http: TLS handshake error from 10.244.1.1:50050: EOF
2018/07/23 19:37:07 http: TLS handshake error from 10.244.1.1:50056: EOF
2018/07/23 19:37:17 http: TLS handshake error from 10.244.1.1:50062: EOF
2018/07/23 19:37:27 http: TLS handshake error from 10.244.1.1:50068: EOF
2018/07/23 19:37:37 http: TLS handshake error from 10.244.1.1:50074: EOF
2018/07/23 19:37:47 http: TLS handshake error from 10.244.1.1:50080: EOF
2018/07/23 19:37:57 http: TLS handshake error from 10.244.1.1:50086: EOF
2018/07/23 19:38:07 http: TLS handshake error from 10.244.1.1:50092: EOF
2018/07/23 19:38:17 http: TLS handshake error from 10.244.1.1:50098: EOF
2018/07/23 19:38:27 http: TLS handshake error from 10.244.1.1:50104: EOF
2018/07/23 19:38:37 http: TLS handshake error from 10.244.1.1:50110: EOF
Pod name: virt-api-7d79764579-wccxk Pod phase: Running
level=info timestamp=2018-07-23T19:37:42.065425Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:37:49 http: TLS handshake error from 10.244.0.1:55672: EOF
2018/07/23 19:37:59 http: TLS handshake error from 10.244.0.1:55696: EOF
2018/07/23 19:38:09 http: TLS handshake error from 10.244.0.1:55720: EOF
level=info timestamp=2018-07-23T19:38:09.477865Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:09.507416Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:12.193396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:38:19 http: TLS handshake error from 10.244.0.1:55744: EOF
2018/07/23 19:38:29 http: TLS handshake error from 10.244.0.1:55768: EOF
2018/07/23 19:38:39 http: TLS handshake error from 10.244.0.1:55792: EOF
level=info timestamp=2018-07-23T19:38:39.664893Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:39.720594Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:41.771609Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:38:41.791888Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:38:42.474231Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6l7pp Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-7xttj Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
Pod name: virt-handler-j242z Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-xkbfk Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n" level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required" level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n" level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.403 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should not scale when paused and scale when resume [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:223

  Expected error:
      <*errors.StatusError | 0xc420e2c630>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Create a new VirtualMachineInstance replica set

Pod name: disks-images-provider-jcfpz
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jr7kq
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-h8lwp
Pod phase: Running
2018/07/23 19:36:47 http: TLS handshake error from 10.244.1.1:50044: EOF
2018/07/23 19:36:57 http: TLS handshake error from 10.244.1.1:50050: EOF
2018/07/23 19:37:07 http: TLS handshake error from 10.244.1.1:50056: EOF
2018/07/23 19:37:17 http: TLS handshake error from 10.244.1.1:50062: EOF
2018/07/23 19:37:27 http: TLS handshake error from 10.244.1.1:50068: EOF
2018/07/23 19:37:37 http: TLS handshake error from 10.244.1.1:50074: EOF
2018/07/23 19:37:47 http: TLS handshake error from 10.244.1.1:50080: EOF
2018/07/23 19:37:57 http: TLS handshake error from 10.244.1.1:50086: EOF
2018/07/23 19:38:07 http: TLS handshake error from 10.244.1.1:50092: EOF
2018/07/23 19:38:17 http: TLS handshake error from 10.244.1.1:50098: EOF
2018/07/23 19:38:27 http: TLS handshake error from 10.244.1.1:50104: EOF
2018/07/23 19:38:37 http: TLS handshake error from 10.244.1.1:50110: EOF
2018/07/23 19:38:47 http: TLS handshake error from 10.244.1.1:50116: EOF
2018/07/23 19:38:57 http: TLS handshake error from 10.244.1.1:50122: EOF
2018/07/23 19:39:07 http: TLS handshake error from 10.244.1.1:50128: EOF

Pod name: virt-api-7d79764579-wccxk
Pod phase: Running
2018/07/23 19:38:29 http: TLS handshake error from 10.244.0.1:55768: EOF
2018/07/23 19:38:39 http: TLS handshake error from 10.244.0.1:55792: EOF
level=info timestamp=2018-07-23T19:38:39.664893Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:39.720594Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:38:41.771609Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:38:41.791888Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:38:42.474231Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:38:49 http: TLS handshake error from 10.244.0.1:55816: EOF
level=info timestamp=2018-07-23T19:38:49.892465Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-23T19:38:49.895436Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/23 19:38:59 http: TLS handshake error from 10.244.0.1:55840: EOF
2018/07/23 19:39:09 http: TLS handshake error from 10.244.0.1:55864: EOF
level=info timestamp=2018-07-23T19:39:09.723758Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:39:09.768841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-23T19:39:12.035688Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-6l7pp
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.030528Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-7xttj
Pod phase: Running
level=info timestamp=2018-07-23T18:57:10.466774Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
level=info timestamp=2018-07-23T19:32:02.083967Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-23T19:32:02.087303Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-23T19:32:02.087415Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-23T19:32:02.087474Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer"
level=info timestamp=2018-07-23T19:32:02.087516Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer"
level=info timestamp=2018-07-23T19:32:02.087558Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiPresetInformer"
level=info timestamp=2018-07-23T19:32:02.087599Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-23T19:32:02.103221Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-23T19:32:02.177906Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-23T19:32:02.178203Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-23T19:32:02.178330Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-23T19:32:02.178495Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
Pod name: virt-handler-j242z
Pod phase: Running
level=info timestamp=2018-07-23T19:31:17.602862Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.633990Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind= uid=d7aab501-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.792324Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.792409Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-23T19:31:17.792435Z pos=vm.go:331 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-07-23T19:31:17.792807Z pos=vm.go:358 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-23T19:31:17.792854Z pos=vm.go:410 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-07-23T19:31:17.793284Z pos=vm.go:585 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmir72t4"
level=info timestamp=2018-07-23T19:31:17.794411Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:31:17.798501Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-23T19:31:17.800770Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-23T19:31:17.800876Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmir72t4, existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800898Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:31:17.800962Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:31:17.801106Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmir72t4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-xkbfk
Pod phase: Running
level=info timestamp=2018-07-23T19:32:17.973055Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:32:17.973174Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:32:17.973234Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:32:17.973274Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:32:17.973344Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-23T19:32:17.973431Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.066387Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: true\n"
level=info timestamp=2018-07-23T19:34:39.067064Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n"
level=info timestamp=2018-07-23T19:34:39.067151Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.067759Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.068303Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind= uid=f99df072-8eae-11e8-9512-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-23T19:34:39.108538Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi9gm2g, existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108707Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-23T19:34:39.108871Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-23T19:34:39.109066Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi9gm2g kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

• Failure [30.568 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should remove the finished VM [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:279

  Expected error:
      <*errors.StatusError | 0xc420efa1b0>: {
          ErrStatus: {
              TypeMeta: {Kind: "", APIVersion: ""},
              ListMeta: {SelfLink: "", ResourceVersion: "", Continue: ""},
              Status: "Failure",
              Message: "Timeout: request did not complete within allowed duration",
              Reason: "Timeout",
              Details: {Name: "", Group: "", Kind: "", UID: "", Causes: nil, RetryAfterSeconds: 0},
              Code: 504,
          },
      }
      Timeout: request did not complete within allowed duration
  not to have occurred

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92
------------------------------
STEP: Creating new replica set
STEP: Create a new VirtualMachineInstance replica set
Waiting for namespace kubevirt-test-default to be removed, this can take a while ...
Waiting for namespace kubevirt-test-alternative to be removed, this can take a while ...
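Both ReplicaSet failures above (and, as the summary below shows, every other failure at replicaset_test.go:92) report the same HTTP 504 from the apiserver, "Timeout: request did not complete within allowed duration", which suggests one shared infrastructure problem rather than many independent test bugs. A quick way to confirm that from a saved copy of this console output is to count occurrences of the timeout message; the `functest.log` file name here is an assumption, not something the job produces by that name:

```shell
# Hypothetical triage step: functest.log is assumed to be a saved copy
# of this console output. Counts how many failure reports carry the
# apiserver timeout message.
grep -c 'Timeout: request did not complete within allowed duration' functest.log
```

If the count matches the number of failed specs, the next place to look is apiserver and etcd health on the node, not the individual tests.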
Summarizing 22 Failures:

[Fail] VMIlifecycle Killed VirtualMachineInstance [It] should be left alone by virt-handler
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:955

[Fail] Storage Starting a VirtualMachineInstance with Alpine PVC should be successfully started [It] with Disk PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance with Alpine PVC should be successfully started [It] with CDRom PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance with Alpine PVC should be successfully started and stopped multiple times [It] with Disk PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance with Alpine PVC should be successfully started and stopped multiple times [It] with CDRom PVC
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance With an emptyDisk defined [It] should create a writeable emptyDisk with the right capacity
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance With an emptyDisk defined and a specified serial number [It] should create a writeable emptyDisk with the specified serial number
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance With ephemeral alpine PVC [It] should be successfully started
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance With ephemeral alpine PVC [It] should not persist data
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Storage Starting a VirtualMachineInstance With VirtualMachineInstance with two PVCs [It] should start vmi multiple times
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:64

[Fail] Expose Expose service on a VM [BeforeEach] Expose ClusterIP service Should expose a Cluster IP service on a VM and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:27

[Fail] Expose Expose service on a VM Expose NodePort service [It] Should expose a NodePort service on a VM and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:134

[Fail] Health Monitoring A VirtualMachineInstance with a watchdog device [It] should be shut down when the watchdog expires
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:64

[Fail] VirtualMachineInstanceReplicaSet should scale [It] to three, to two and then to zero replicas
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet should scale [It] to five, to six and then to zero replicas
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should be rejected on POST if spec is invalid
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should reject POST if validation webhoook deems the spec is invalid
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should update readyReplicas once VMIs are up
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should remove VMIs once it is marked for deletion
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should remove owner references on the VirtualMachineInstance if it is orphan deleted
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should not scale when paused and scale when resume
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

[Fail] VirtualMachineInstanceReplicaSet [It] should remove the finished VM
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:92

Ran 130 of 144 Specs in 5162.817 seconds
FAIL! -- 108 Passed | 22 Failed | 0 Pending | 14 Skipped
--- FAIL: TestTests (5162.83s)
FAIL
make: *** [functest] Error 1
+ make cluster-down
./cluster/down.sh
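When a run like this produces many failures, tallying the failing source locations from the "Summarizing 22 Failures" section makes the clusters obvious (here: ten at storage_test.go:64, nine at replicaset_test.go:92). A sketch of that tally, assuming the summary section has been saved to a file named `summary.log` (a hypothetical name):

```shell
# Count failures per test file from the saved Ginkgo summary section.
# summary.log is a hypothetical file holding the "[Fail] ..." lines and
# their /root/go/src/... locations from the output above.
grep -o '[a-z_]*_test\.go:[0-9]*' summary.log | cut -d: -f1 | sort | uniq -c | sort -rn
```

Each output line is a count followed by a test file; the top entries point at the specs (or the shared helper line they all fail in) worth investigating first.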