+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release
+ [[ k8s-1.11.0-release =~ openshift-.* ]]
+ [[ k8s-1.11.0-release =~ .*-1.9.3-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.3
+ KUBEVIRT_PROVIDER=k8s-1.10.3
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/07/27 06:18:13 Waiting for host: 192.168.66.101:22
2018/07/27 06:18:16 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/07/27 06:18:29 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.3
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 25.003571 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:c57f3a65b217d47a56bdeed0a65f48c6edff1bf69c56abc9299f689a7f46ea69

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/07/27 06:19:11 Waiting for host: 192.168.66.102:22
2018/07/27 06:19:14 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/27 06:19:26 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 39588992 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01 Ready master 35s v1.10.3
node02 NotReady 9s v1.10.3
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ grep NotReady
++ cluster/kubectl.sh get nodes --no-headers
+ '[' -n 'node02 NotReady 10s v1.10.3' ']'
+ echo 'Waiting for all nodes to become ready ...'
Waiting for all nodes to become ready ...
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01 Ready master 36s v1.10.3
node02 NotReady 10s v1.10.3
+ kubectl_rc=0
+ sleep 10
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ grep NotReady
++ cluster/kubectl.sh get nodes --no-headers
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    47s       v1.10.3
node02    Ready     21s       v1.10.3
+ make cluster-sync
./cluster/build.sh
Building ...
sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32
go version go1.10 linux/amd64
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
Compiling tests...
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.38 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
 ---> Using cache
 ---> b4f3251c6468
Step 4/8 : WORKDIR /home/virt-controller
 ---> Using cache
 ---> 813752072d9d
Step 5/8 : USER 1001
 ---> Using cache
 ---> 88b3556f36b4
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
 ---> e40e20bfd913
Removing intermediate container 103c44e18b3e
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
 ---> Running in 0429e3852b72
 ---> 5649e7c1f141
Removing intermediate container 0429e3852b72
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "virt-controller" ''
 ---> Running in 654a8b929912
 ---> d4cf46aec922
Removing intermediate container 654a8b929912
Successfully built d4cf46aec922
Sending build context to Docker daemon 43.31 MB
Step 1/10 : FROM kubevirt/libvirt:4.2.0
 ---> 5f0bfe81a3e0
Step 2/10 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 09010a005182
Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
 ---> Using cache
 ---> fc9481693838
Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher
 ---> d2f3d20792f0
Removing intermediate container d2a10488ff15
Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt
 ---> 46a648a9c75c
Removing intermediate container 7bbb85fe1372
Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
 ---> Running in 9bfddc76fa6c
 ---> 06dfe1cb4f54
Removing intermediate container 9bfddc76fa6c
Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
 ---> Running in 1fbc07e08c19
 ---> c807947dcb57
Removing intermediate container 1fbc07e08c19
Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/
 ---> 716fcba14eed
Removing intermediate container 5084b0a8e15b
Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh
 ---> Running in 726eddd5f54b
 ---> 869a5b2bf946
Removing intermediate container 726eddd5f54b
Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "virt-launcher" ''
 ---> Running in e5df40e5f633
 ---> bac65fdbecf3
Removing intermediate container e5df40e5f633
Successfully built bac65fdbecf3
Sending build context to Docker daemon 41.69 MB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/5 : COPY virt-handler /usr/bin/virt-handler
 ---> d40c041d1ac0
Removing intermediate container e9a20fe01e95
Step 4/5 : ENTRYPOINT /usr/bin/virt-handler
 ---> Running in 0f2798fd4062
 ---> ca804d8cfab3
Removing intermediate container 0f2798fd4062
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "virt-handler" ''
 ---> Running in 0d27d70c87ab
 ---> 4e057f8d5348
Removing intermediate container 0d27d70c87ab
Successfully built 4e057f8d5348
Sending build context to Docker daemon 38.81 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api
 ---> Using cache
 ---> 3cff23506e80
Step 4/8 : WORKDIR /home/virt-api
 ---> Using cache
 ---> e94c5606b96b
Step 5/8 : USER 1001
 ---> Using cache
 ---> af16317199f5
Step 6/8 : COPY virt-api /usr/bin/virt-api
 ---> fd4ef1550e4a
Removing intermediate container ae5eb50034eb
Step 7/8 : ENTRYPOINT /usr/bin/virt-api
 ---> Running in 581241de478f
 ---> e40b509c6723
Removing intermediate container 581241de478f
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "virt-api" ''
 ---> Running in 4773f0f80fc7
 ---> 9fcada066c15
Removing intermediate container 4773f0f80fc7
Successfully built 9fcada066c15
Sending build context to Docker daemon 4.096 kB
Step 1/7 : FROM fedora:28
 ---> cc510acfcd70
Step 2/7 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/7 : ENV container docker
 ---> Using cache
 ---> aed3ca4ac3a3
Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img
 ---> Using cache
 ---> c7d0cf8fc982
Step 5/7 : ADD entrypoint.sh /
 ---> Using cache
 ---> 0393e5ee0c37
Step 6/7 : CMD /entrypoint.sh
 ---> Using cache
 ---> 23798f49dea3
Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.11.0-release0" ''
 ---> Using cache
 ---> 765ec8238c16
Successfully built 765ec8238c16
Sending build context to Docker daemon 2.56 kB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/5 : ENV container docker
 ---> Using cache
 ---> aed3ca4ac3a3
Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all
 ---> Using cache
 ---> d8c990eaf575
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "vm-killer" ''
 ---> Using cache
 ---> 5ee0ce3a89ab
Successfully built 5ee0ce3a89ab
Sending build context to Docker daemon 5.12 kB
Step 1/7 : FROM debian:sid
 ---> 68f33cf86aab
Step 2/7 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 50fc79ebe51c
Step 3/7 : ENV container docker
 ---> Using cache
 ---> b8e063496923
Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 8adb1572b35c
Step 5/7 : ADD entry-point.sh /
 ---> Using cache
 ---> 8c0c5a52e4df
Step 6/7 : CMD /entry-point.sh
 ---> Using cache
 ---> 1a4b838e5dee
Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "registry-disk-v1alpha" ''
 ---> Using cache
 ---> add62c0e1d95
Successfully built add62c0e1d95
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:32965/kubevirt/registry-disk-v1alpha:devel
 ---> add62c0e1d95
Step 2/4 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> a5490aa9e5be
Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img
 ---> Using cache
 ---> 6331e7352723
Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release0" ''
 ---> Using cache
 ---> 783f6a855eb1
Successfully built 783f6a855eb1
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:32965/kubevirt/registry-disk-v1alpha:devel
 ---> add62c0e1d95
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 206ba1fb161b
Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2
 ---> Using cache
 ---> 6626be85343a
Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release0" ''
 ---> Using cache
 ---> b0afdf8985de
Successfully built b0afdf8985de
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:32965/kubevirt/registry-disk-v1alpha:devel
 ---> add62c0e1d95
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 206ba1fb161b
Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
 ---> Using cache
 ---> 8256b847143d
Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release0" ''
 ---> Using cache
 ---> cd29fe0a448d
Successfully built cd29fe0a448d
Sending build context to Docker daemon 35.59 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl
 ---> Using cache
 ---> d74088d7a4fc
Step 4/8 : WORKDIR /home/virtctl
 ---> Using cache
 ---> c8c857bf8d96
Step 5/8 : USER 1001
 ---> Using cache
 ---> 36730a67b946
Step 6/8 : COPY subresource-access-test /subresource-access-test
 ---> 8f761bd7e61f
Removing intermediate container 0f1c45e72ce6
Step 7/8 : ENTRYPOINT /subresource-access-test
 ---> Running in d38447167035
 ---> 0df79703c2d5
Removing intermediate container d38447167035
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "subresource-access-test" ''
 ---> Running in f7d661757755
 ---> e8b3f9463641
Removing intermediate container f7d661757755
Successfully built e8b3f9463641
Sending build context to Docker daemon 3.072 kB
Step 1/9 : FROM fedora:28
 ---> cc510acfcd70
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 84920e004a40
Step 3/9 : ENV container docker
 ---> Using cache
 ---> aed3ca4ac3a3
Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all
 ---> Using cache
 ---> 6050b24a5d85
Step 5/9 : ENV GIMME_GO_VERSION 1.9.2
 ---> Using cache
 ---> 0447d2178073
Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh
 ---> Using cache
 ---> 291db82d955f
Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin"
 ---> Using cache
 ---> 793556477837
Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli
 ---> Using cache
 ---> fd5c6e1f9461
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release0" '' "winrmcli" ''
 ---> Using cache
 ---> 8f05931f9af2
Successfully built 8f05931f9af2
Sending build context to Docker daemon 36.79 MB
Step 1/5 : FROM fedora:27
 ---> 9110ae7f579f
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 71a8c548e503
Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar
 ---> ccb584c87bb9
Removing intermediate container fb468d642cf2
Step 4/5 : ENTRYPOINT /example-hook-sidecar
 ---> Running in bfd1ee1880d4
 ---> 88c6de6e8bc7
Removing intermediate container bfd1ee1880d4
Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.11.0-release0" ''
 ---> Running in 742126fc23b9
 ---> 9cd9e6ec291a
Removing intermediate container 742126fc23b9
Successfully built 9cd9e6ec291a
hack/build-docker.sh push
The push refers to a repository [localhost:32965/kubevirt/virt-controller]
40b121e65161: Preparing
b2f5abdac324: Preparing
891e1e4ef82a: Preparing
b2f5abdac324: Pushed
40b121e65161: Pushed
891e1e4ef82a: Pushed
devel: digest: sha256:3e19d1d2fc701be83f289e3c4758216163fa9f29521b36a0596a3dcd93e1d6b7 size: 949
The push refers to a repository [localhost:32965/kubevirt/virt-launcher]
e3173fa271ab: Preparing
7fe2a4444322: Preparing
8cf71f00028c: Preparing
cdd674c84e53: Preparing
812cecc2c1a7: Preparing
0b99c4111657: Preparing
da38cf808aa5: Preparing
b83399358a92: Preparing
186d8b3e4fd8: Preparing
fa6154170bf5: Preparing
5eefb9960a36: Preparing
891e1e4ef82a: Preparing
da38cf808aa5: Waiting
b83399358a92: Waiting
186d8b3e4fd8: Waiting
fa6154170bf5: Waiting
5eefb9960a36: Waiting
891e1e4ef82a: Waiting
0b99c4111657: Waiting
cdd674c84e53: Pushed
e3173fa271ab: Pushed
7fe2a4444322: Pushed
da38cf808aa5: Pushed
b83399358a92: Pushed
186d8b3e4fd8: Pushed
fa6154170bf5: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-controller
8cf71f00028c: Pushed
812cecc2c1a7: Pushed
0b99c4111657: Pushed
5eefb9960a36: Pushed
devel: digest: sha256:052039f6d73a95943c72597180a34478d76cb2a77a78a8729426e43474b5dc20 size: 2828
The push refers to a repository [localhost:32965/kubevirt/virt-handler]
41cb357213b4: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-launcher
41cb357213b4: Pushed
devel: digest: sha256:7f122295060d1f3e66c98d3fd2ddd89f096783a62b9e6cac0da9a0d57dfb26a0 size: 741
The push refers to a repository [localhost:32965/kubevirt/virt-api]
ba33c524fd16: Preparing
afd1d781e4d1: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-handler
afd1d781e4d1: Pushed
ba33c524fd16: Pushed
devel: digest: sha256:6820697cafe3dfb9e53d0756a067ca0278dd34ddffd5307c39577565b2d7b7bd size: 948
The push refers to a repository [localhost:32965/kubevirt/disks-images-provider]
dc0875c44573: Preparing
8fc77a44094f: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-api
dc0875c44573: Pushed
8fc77a44094f: Pushed
devel: digest: sha256:cfc5e19e670a01645721f5abaa4403d2fc12abc46367fb2887257e1afa047423 size: 948
The push refers to a repository [localhost:32965/kubevirt/vm-killer]
d1b69e768421: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/disks-images-provider
d1b69e768421: Pushed
devel: digest: sha256:d38761b5753d9010b1e9ad673a38b2ea45d1549351679e5f0313eaa510982293 size: 740
The push refers to a repository [localhost:32965/kubevirt/registry-disk-v1alpha]
2a15632f54d4: Preparing
91a924e03d7c: Preparing
25edbec0eaea: Preparing
2a15632f54d4: Pushed
91a924e03d7c: Pushed
25edbec0eaea: Pushed
devel: digest: sha256:8038414ca4d2d4d28aa8194cacf73410bab5841d8d75a01392aead10f5d2e0d7 size: 948
The push refers to a repository [localhost:32965/kubevirt/cirros-registry-disk-demo]
88bb89162d63: Preparing
2a15632f54d4: Preparing
91a924e03d7c: Preparing
25edbec0eaea: Preparing
91a924e03d7c: Mounted from kubevirt/registry-disk-v1alpha
2a15632f54d4: Mounted from kubevirt/registry-disk-v1alpha
25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha
88bb89162d63: Pushed
devel: digest: sha256:0f522b27abd68c33bb18831fba6371004bbf5eff60f58ce1945fd338493c4117 size: 1160
The push refers to a repository [localhost:32965/kubevirt/fedora-cloud-registry-disk-demo]
f7e58045eb82: Preparing
2a15632f54d4: Preparing
91a924e03d7c: Preparing
25edbec0eaea: Preparing
91a924e03d7c: Mounted from kubevirt/cirros-registry-disk-demo
2a15632f54d4: Mounted from kubevirt/cirros-registry-disk-demo
25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo
f7e58045eb82: Pushed
devel: digest: sha256:86e68a02d7ae02646d1734be2e9e0e2b54e4bf6a3657f47147a48d6978578493 size: 1161
The push refers to a repository [localhost:32965/kubevirt/alpine-registry-disk-demo]
802a64d5c8c0: Preparing
2a15632f54d4: Preparing
91a924e03d7c: Preparing
25edbec0eaea: Preparing
91a924e03d7c: Mounted from kubevirt/fedora-cloud-registry-disk-demo
25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo
2a15632f54d4: Mounted from kubevirt/fedora-cloud-registry-disk-demo
802a64d5c8c0: Pushed
devel: digest: sha256:bcbcafc5fa6623ceaed64c3b463dbe456b352d541c65412b4b238915fbd663b8 size: 1160
The push refers to a repository [localhost:32965/kubevirt/subresource-access-test]
aa7efe7716c1: Preparing
4052ce9d0aff: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/vm-killer
4052ce9d0aff: Pushed
aa7efe7716c1: Pushed
devel: digest: sha256:ebcbeb95aa6ca824b7480fa61072e7bd99108a0ccd27bd3cfade9a5b6556c43a size: 948
The push refers to a repository [localhost:32965/kubevirt/winrmcli]
64ccc7ac4271: Preparing
4242962b50c3: Preparing
0e374d8c733e: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/subresource-access-test
64ccc7ac4271: Pushed
0e374d8c733e: Pushed
4242962b50c3: Pushed
devel: digest: sha256:09b2bc3b44366be5fed66f121c9d70a1775010a64b55e0fd3f96ddfc4f2e1c69 size: 1165
The push refers to a repository [localhost:32965/kubevirt/example-hook-sidecar]
dd3b7aae9cec: Preparing
39bae602f753: Preparing
dd3b7aae9cec: Pushed
39bae602f753: Pushed
devel: digest: sha256:c58b61a3fb1872f74383837fc2f98a8750ecbd32f70efc12fc5e9169276918ae size: 740
make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt'
Done
./cluster/clean.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release0
++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release0
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-138-g319cb65
++ KUBEVIRT_VERSION=v0.7.0-138-g319cb65
+ source cluster/k8s-1.10.3/provider.sh
++ set -e
++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.3.sh
++ source hack/config-provider-k8s-1.10.3.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl
+++ docker_prefix=localhost:32965/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + read p + grep foregroundDeleteVirtualMachine error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + cluster/k8s-1.10.3/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig + 
KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n default delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete secrets -l kubevirt.io
No resources found
+ _kubectl -n default delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pv -l kubevirt.io
No resources found
+ _kubectl -n default delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pvc -l kubevirt.io
No resources found
+ _kubectl -n default delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete ds -l kubevirt.io
No resources found
+ _kubectl -n default delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n default delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete pods -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete roles -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n default delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n default delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ wc -l
++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ cluster/k8s-1.10.3/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ for i in '${namespaces[@]}'
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete deployment -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete rs -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete services -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete secrets -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pv -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pvc -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete ds -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete pods -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete roles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io
No resources found
++ wc -l
++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
++ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
++ cluster/k8s-1.10.3/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ sleep 2
+ echo Done
Done
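The teardown trace above repeats the same pattern for every resource type in every namespace: export KUBECONFIG, run a labeled `delete`, and ignore "No resources found". A minimal dry-run sketch of that loop follows; `kubectl_cmd` is a hypothetical stand-in (it echoes the command instead of invoking cluster/k8s-1.10.3/.kubectl):

```shell
#!/bin/sh
# Dry-run sketch of the label-driven cleanup loop traced above.
# "kubectl_cmd" is a hypothetical stand-in for the real wrapper, which
# exports KUBECONFIG and invokes cluster/k8s-1.10.3/.kubectl.
kubectl_cmd() { echo "kubectl $*"; }

namespaces="default kube-system"
resources="validatingwebhookconfiguration secrets pv pvc ds \
customresourcedefinitions pods clusterrolebinding rolebinding roles \
clusterroles serviceaccounts"

for ns in $namespaces; do
  for res in $resources; do
    # Everything KubeVirt deploys carries the kubevirt.io label, so one
    # labeled delete per resource type clears that namespace of it.
    kubectl_cmd -n "$ns" delete "$res" -l kubevirt.io
  done
done
```

Driving the cleanup off a shared label rather than a list of object names is what lets the script stay correct as manifests change.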
./cluster/deploy.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release0
++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release0
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-138-g319cb65
++ KUBEVIRT_VERSION=v0.7.0-138-g319cb65
+ source cluster/k8s-1.10.3/provider.sh
++ set -e
++ image=k8s-1.10.3@sha256:d6290260e7e6b84419984f12719cf592ccbe327373b8df76aa0481f8ec01d357
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ KUBEVIRT_PROVIDER=k8s-1.10.3
++ source hack/config-default.sh source hack/config-k8s-1.10.3.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.3.sh
++ source hack/config-provider-k8s-1.10.3.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.3/.kubectl
+++ docker_prefix=localhost:32965/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Deploying ...'
Deploying ...
+ [[ -z k8s-1.11.0-release ]]
+ [[ k8s-1.11.0-release =~ .*-dev ]]
+ [[ k8s-1.11.0-release =~ .*-release ]]
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]]
+ continue
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]]
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created
serviceaccount "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created
rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
role.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created
serviceaccount "kubevirt-controller" created
serviceaccount "kubevirt-privileged" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created
service "virt-api" created
deployment.extensions "virt-api" created
deployment.extensions "virt-controller" created
daemonset.extensions "virt-handler" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
+ export KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.3/.kubeconfig
+ cluster/k8s-1.10.3/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
persistentvolumeclaim "disk-alpine" created
persistentvolume "host-path-disk-alpine" created
persistentvolumeclaim "disk-custom" created
persistentvolume "host-path-disk-custom" created
daemonset.extensions "disks-images-provider" created
serviceaccount "kubevirt-testing" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created
+ [[ k8s-1.10.3 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n 'virt-api-7d79764579-zn76g 0/1 ContainerCreating 0 6s
virt-api-7d79764579-zplcq 0/1 ContainerCreating 0 6s
virt-controller-7d57d96b65-6dx5k 0/1 ContainerCreating 0 5s
virt-controller-7d57d96b65-ct5kr 0/1 ContainerCreating 0 6s
virt-handler-2qczc 0/1 ContainerCreating 0 5s
virt-handler-n5fk6 0/1 ContainerCreating 0 6s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ cluster/kubectl.sh get pods -n kube-system --no-headers
+ grep -v Running
disks-images-provider-tbdb6        0/1   ContainerCreating   0   2s
disks-images-provider-zk9q6        0/1   Pending             0   2s
virt-api-7d79764579-zn76g          0/1   ContainerCreating   0   8s
virt-api-7d79764579-zplcq          0/1   ContainerCreating   0   8s
virt-controller-7d57d96b65-6dx5k   0/1   ContainerCreating   0   7s
virt-controller-7d57d96b65-ct5kr   0/1   ContainerCreating   0   8s
virt-handler-2qczc                 0/1   ContainerCreating   0   7s
virt-handler-n5fk6                 0/1   ContainerCreating   0   8s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n false ']'
+ echo 'Waiting for KubeVirt containers to become ready ...'
Waiting for KubeVirt containers to become ready ...
+ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ grep false
+ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
false
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                               READY     STATUS    RESTARTS   AGE
disks-images-provider-tbdb6        1/1       Running   0          1m
disks-images-provider-zk9q6        1/1       Running   0          1m
etcd-node01                        1/1       Running   0          7m
kube-apiserver-node01              1/1       Running   0          7m
kube-controller-manager-node01     1/1       Running   0          7m
kube-dns-86f4d74b45-5tv6z          3/3       Running   0          8m
kube-flannel-ds-sb69k              1/1       Running   0          8m
kube-flannel-ds-zgn4b              1/1       Running   0          8m
kube-proxy-4bv54                   1/1       Running   0          8m
kube-proxy-ltrwx                   1/1       Running   0          8m
kube-scheduler-node01              1/1       Running   0          7m
virt-api-7d79764579-zn76g          1/1       Running   0          1m
virt-api-7d79764579-zplcq          1/1       Running   0          1m
virt-controller-7d57d96b65-6dx5k   1/1       Running   0          1m
virt-controller-7d57d96b65-ct5kr   1/1       Running   0          1m
virt-handler-2qczc                 1/1       Running   0          1m
virt-handler-n5fk6                 1/1       Running   0          1m
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n default --no-headers
No resources found.
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
No resources found.
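The trace above shows two polling loops: first wait until no pod in the namespace is outside the Running state, then wait until no container reports ready=false, each polling every 30s up to a 300s timeout. A minimal runnable sketch of the first loop, with a hypothetical `pending_pods` stub standing in for `kubectl get pods -n kube-system --no-headers | grep -v Running`:

```shell
#!/bin/sh
# Sketch of the pod-readiness polling loop traced above. "pending_pods" is a
# hypothetical stub; the real script pipes kubectl output through grep.
timeout=300   # seconds before giving up, as in the trace
sample=30     # seconds between polls, as in the trace
polls=0

pending_pods() {
  # Stub: report one pod still ContainerCreating on the first poll only.
  if [ "$polls" -eq 0 ]; then
    echo "virt-api-xxxxx 0/1 ContainerCreating 0 6s"
  fi
}

wait_running() {
  current_time=0
  while [ -n "$(pending_pods)" ]; do
    echo "Waiting for kubevirt pods to enter the Running state ..."
    # The real script does: sleep "$sample"; omitted so the sketch runs instantly.
    polls=$((polls + 1))
    current_time=$((current_time + sample))
    if [ "$current_time" -gt "$timeout" ]; then
      return 1
    fi
  done
  return 0
}

wait_running && echo "all pods Running"
```

The second loop in the trace is the same shape with a different predicate: `kubectl get pods -ocustom-columns=status:status.containerStatuses[*].ready --no-headers | grep false`, which catches pods that are Running but not yet ready.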
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:17:39Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:05:37Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/junit.xml'
+ [[ k8s-1.11.0-release =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release/junit.xml'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
compiled tests.test
hack/functests.sh
Running Suite: Tests Suite
==========================
Random Seed: 1532672897
Will run 148 of 148 specs
•••••••••••
------------------------------
• [SLOW TEST:37.997 seconds]
LeaderElection
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43
Start a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53
when the controller pod is not running
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54
should success
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55
------------------------------
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 06:30:38 http: TLS handshake error from 10.244.0.1:57670: EOF
2018/07/27 06:30:48 http: TLS handshake error from 10.244.0.1:57694: EOF
level=info timestamp=2018-07-27T06:30:55.202161Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:30:56.790952Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:30:58 http: TLS handshake error from 10.244.0.1:57718: EOF
2018/07/27 06:31:08 http: TLS handshake error from 10.244.0.1:57742: EOF
2018/07/27 06:31:18 http: TLS handshake error from 10.244.0.1:57766: EOF
level=info timestamp=2018-07-27T06:31:25.236844Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:31:26.816801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:31:28 http: TLS handshake error from 10.244.0.1:57790: EOF
2018/07/27 06:31:38 http: TLS handshake error from 10.244.0.1:57814: EOF
2018/07/27 06:31:48 http: TLS handshake error from 10.244.0.1:57838: EOF
level=info timestamp=2018-07-27T06:31:55.268895Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:31:56.846360Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:31:58 http: TLS handshake error from 10.244.0.1:57862: EOF
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 06:30:09 http: TLS handshake error from 10.244.1.1:46648: EOF
2018/07/27 06:30:19 http: TLS handshake error from 10.244.1.1:46654: EOF
2018/07/27 06:30:29 http: TLS handshake error from 10.244.1.1:46660: EOF
level=info timestamp=2018-07-27T06:30:34.934859Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:30:39 http: TLS handshake error from 10.244.1.1:46666: EOF
2018/07/27 06:30:49 http: TLS handshake error from 10.244.1.1:46672: EOF
2018/07/27 06:30:59 http: TLS handshake error from 10.244.1.1:46678: EOF
level=info timestamp=2018-07-27T06:31:04.957664Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:31:09 http: TLS handshake error from 10.244.1.1:46684: EOF
2018/07/27 06:31:19 http: TLS handshake error from 10.244.1.1:46690: EOF
2018/07/27 06:31:29 http: TLS handshake error from 10.244.1.1:46696: EOF
level=info timestamp=2018-07-27T06:31:34.938402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:31:39 http: TLS handshake error from 10.244.1.1:46702: EOF
2018/07/27 06:31:49 http: TLS handshake error from 10.244.1.1:46708: EOF
2018/07/27 06:31:59 http: TLS handshake error from 10.244.1.1:46714: EOF
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T06:28:44.538861Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmirsInformer"
level=info timestamp=2018-07-27T06:28:44.538994Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-07-27T06:28:44.539040Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-07-27T06:28:44.539061Z pos=virtinformers.go:104 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-07-27T06:28:44.539308Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-07-27T06:28:44.548828Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-07-27T06:28:44.548962Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-07-27T06:28:44.549039Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-07-27T06:28:44.549226Z pos=preset.go:71 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-07-27T06:28:45.156487Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmim9s5p kind= uid=501bebb3-9166-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:28:45.167382Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmim9s5p kind= uid=501bebb3-9166-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:28:45.267410Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmim9s5p\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmim9s5p"
level=info timestamp=2018-07-27T06:29:02.866354Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib6z8z kind= uid=5aa9732a-9166-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:29:02.866521Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib6z8z kind= uid=5aa9732a-9166-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:29:02.915799Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmib6z8z\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmib6z8z"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T06:29:02.585403Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind= uid=501bebb3-9166-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T06:29:02.585939Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind= uid=501bebb3-9166-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmim9s5p"
level=info timestamp=2018-07-27T06:29:02.807523Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind= uid=501bebb3-9166-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:29:02.807729Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-07-27T06:29:02.807779Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Processing shutdown."
level=info timestamp=2018-07-27T06:29:02.808388Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:29:02.810575Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T06:29:02.810750Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T06:29:02.810830Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:29:02.810899Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:29:02.813848Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T06:29:03.288361Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:29:03.288496Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:29:03.289233Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:29:03.289333Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmim9s5p kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-launcher-testvmib6z8z-2jbsj
Pod phase: Running
level=info timestamp=2018-07-27T06:29:06.207295Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:29:06.207576Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:29:06.210392Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:29:16.218495Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:29:16.266495Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmib6z8z"
level=info timestamp=2018-07-27T06:29:16.268546Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:29:16.268862Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.906 seconds]
Health Monitoring
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37
A VirtualMachineInstance with a watchdog device
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56
should be shut down when the watchdog expires [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57
Timed out after 90.010s.
Timed out waiting for VMI to enter Running phase
Expected
  : false
to equal
  : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-27T06:29:02.961780Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmib6z8z-2jbsj"
S [SKIPPING] in Spec Setup (BeforeEach) [0.008 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
should succeed to start a vmi [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:133
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.004 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
should succeed to stop a running vmi [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:139
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.006 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
with winrm connection [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
should have correct UUID
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:192
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.005 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
with winrm connection [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
should have pod IP
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:208
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.005 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
with kubectl command [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
should succeed to start a vmi
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:242
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.010 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
with kubectl command [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
should succeed to stop a vmi
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:250
Skip Windows tests that requires PVC disk-windows
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1352
------------------------------
• [SLOW TEST:49.766 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userDataBase64 source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
should have cloud-init data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:82
------------------------------
2018/07/27 02:32:53 read closing down: EOF
• [SLOW TEST:163.617 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userDataBase64 source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
with injected ssh-key
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:92
should have ssh-key under authorized keys
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:93
------------------------------
2018/07/27 02:35:36 read closing down: EOF
2018/07/27 02:36:25 read closing down: EOF
• [SLOW TEST:59.128 seconds]
CloudInit UserData
2018/07/27 02:36:36 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
with cloudInitNoCloud userData source
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:118
should process provided cloud-init data
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:119
------------------------------
2018/07/27 02:37:25 read closing down: EOF
• [SLOW TEST:49.434 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
should take user-data from k8s secret
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:162
------------------------------
• [SLOW TEST:50.132 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
with a cirros image
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:67
2018/07/27 02:38:15 read closing down: EOF
should return that we are running cirros
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:68
------------------------------
• [SLOW TEST:60.090 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
with a fedora image
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:77
2018/07/27 02:39:15 read closing down: EOF
should return that we are running fedora
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:78
------------------------------
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=info timestamp=2018-07-27T06:39:28.044161Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
2018/07/27 06:39:28 http: TLS handshake error from 10.244.0.1:59094: EOF
level=error timestamp=2018-07-27T06:39:35.024082Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:35.024168Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:36.078551Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:36.078644Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:38.185427Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:38.185518Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
2018/07/27 06:39:38 http: TLS handshake error from 10.244.0.1:59136: EOF
level=error timestamp=2018-07-27T06:39:39.239337Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:39.239433Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:41.333953Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:41.334051Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:42.384645Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:42.384790Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
level=info timestamp=2018-07-27T06:39:33.294360Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=info timestamp=2018-07-27T06:39:34.953321Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=error timestamp=2018-07-27T06:39:37.068251Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:37.068402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
2018/07/27 06:39:39 http: TLS handshake error from 10.244.1.1:46994: EOF
level=error timestamp=2018-07-27T06:39:40.213564Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:40.213674Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:43.363350Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:43.363491Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:44.418591Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:44.418736Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:45.468392Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:45.468471Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:39:46.516993Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:39:46.517081Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi8whpl/console proto=HTTP/1.1 statusCode=400 contentLength=90
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T06:35:37.280600Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicwzsk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicwzsk"
level=info timestamp=2018-07-27T06:35:37.297092Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicwzsk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicwzsk"
level=info timestamp=2018-07-27T06:36:36.351324Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisrqbw kind= uid=68f58bea-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:36:36.351474Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisrqbw kind= uid=68f58bea-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:37:25.767359Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifngpm kind= uid=866a0c6d-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:37:25.768835Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifngpm kind= uid=866a0c6d-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:37:25.823962Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifngpm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifngpm"
level=info timestamp=2018-07-27T06:38:15.898229Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:38:15.898400Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:38:15.983470Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwzrcq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwzrcq"
level=info timestamp=2018-07-27T06:38:15.998269Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwzrcq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwzrcq"
level=info timestamp=2018-07-27T06:39:16.605301Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8whpl kind= uid=c8791698-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:39:16.605543Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8whpl kind= uid=c8791698-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:39:17.304301Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8whpl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8whpl"
level=info timestamp=2018-07-27T06:39:17.351311Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8whpl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8whpl"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T06:38:37.190213Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:38:37.197816Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:38:37.197884Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:38:37.201910Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.321600Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T06:39:16.330707Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T06:39:16.331867Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiwzrcq"
level=info timestamp=2018-07-27T06:39:16.812524Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T06:39:16.812861Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T06:39:16.813209Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.813295Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:39:16.813356Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.814393Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:39:16.814594Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.815211Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-launcher-testvmi8whpl-ffmbp
Pod phase: Running
level=info timestamp=2018-07-27T06:39:20.948425Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:39:20.948775Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:39:20.950099Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:39:30.959202Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:39:30.980451Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi8whpl"
level=info timestamp=2018-07-27T06:39:30.982526Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:39:30.982802Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [30.756 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
with a serial console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
should be able to reconnect to console multiple times [It]
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:87

Expected error:
    <*errors.errorString | 0xc4206e1bd0>: {
        s: "Timeout trying to connect to the virtual machine instance",
    }
    Timeout trying to connect to the virtual machine instance
not to have occurred
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:100
------------------------------
STEP: Creating a new VirtualMachineInstance
STEP: Checking that the console output equals to expected one
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=error timestamp=2018-07-27T06:40:06.118402Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:06.118509Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:07.171070Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:07.171160Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:08.223298Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:08.223508Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
2018/07/27 06:40:08 http: TLS handshake error from 10.244.0.1:59268: EOF
level=info timestamp=2018-07-27T06:40:10.397066Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T06:40:10.397766Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=error timestamp=2018-07-27T06:40:11.380745Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:11.380837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:15.602952Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:15.603078Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:16.659005Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:16.659088Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
level=info timestamp=2018-07-27T06:40:00.758766Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:02.865478Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:02.865567Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=info timestamp=2018-07-27T06:40:04.963993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=error timestamp=2018-07-27T06:40:09.203578Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:09.203662Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
2018/07/27 06:40:09 http: TLS handshake error from 10.244.1.1:47012: EOF
level=error timestamp=2018-07-27T06:40:10.254091Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:10.254158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:12.357794Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:12.357889Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:13.422591Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:13.422702Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
level=error timestamp=2018-07-27T06:40:14.473575Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduling instead of Running" msg="Failed to gather remote exec info for subresource request."
level=info timestamp=2018-07-27T06:40:14.473708Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi5p8wz/console proto=HTTP/1.1 statusCode=400 contentLength=90
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T06:36:36.351324Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisrqbw kind= uid=68f58bea-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:36:36.351474Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisrqbw kind= uid=68f58bea-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:37:25.767359Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifngpm kind= uid=866a0c6d-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:37:25.768835Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifngpm kind= uid=866a0c6d-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:37:25.823962Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifngpm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifngpm"
level=info timestamp=2018-07-27T06:38:15.898229Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:38:15.898400Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:38:15.983470Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwzrcq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwzrcq"
level=info timestamp=2018-07-27T06:38:15.998269Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwzrcq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwzrcq"
level=info timestamp=2018-07-27T06:39:16.605301Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8whpl kind= uid=c8791698-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:39:16.605543Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8whpl kind= uid=c8791698-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:39:17.304301Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8whpl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8whpl"
level=info timestamp=2018-07-27T06:39:17.351311Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8whpl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8whpl"
level=info timestamp=2018-07-27T06:39:47.112279Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5p8wz kind= uid=daa95640-9167-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:39:47.112410Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5p8wz kind= uid=daa95640-9167-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T06:38:37.190213Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:38:37.197816Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:38:37.197884Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:38:37.201910Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.321600Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T06:39:16.330707Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T06:39:16.331867Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiwzrcq"
level=info timestamp=2018-07-27T06:39:16.812524Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T06:39:16.812861Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T06:39:16.813209Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind= uid=a44bc930-9167-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.813295Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:39:16.813356Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.814393Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T06:39:16.814594Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwzrcq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:39:16.815211Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-launcher-testvmi5p8wz-kjtfz
Pod phase: Running
level=info timestamp=2018-07-27T06:39:50.474570Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:39:50.475620Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:39:50.477117Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:40:00.482754Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:40:00.529537Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi5p8wz"
level=info timestamp=2018-07-27T06:40:00.531302Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:40:00.531556Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmi8whpl-ffmbp
Pod phase: Failed
Unable to retrieve container logs for docker://b494533edbf3291b698c12cd69597b05ab14dadb9a8dd3242c932e826a0e3d5a
• Failure [30.641 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should wait until the virtual machine is in running state and return a stream interface [It]
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:103

      Expected error:
          <*errors.errorString | 0xc42089aa90>: {
              s: "Timeout trying to connect to the virtual machine instance",
          }
          Timeout trying to connect to the virtual machine instance
      not to have occurred

      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:109
------------------------------
STEP: Creating a new VirtualMachineInstance
• [SLOW TEST:30.239 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should fail waiting for the virtual machine instance to be running
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:111
------------------------------
• [SLOW TEST:30.220 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should fail waiting for the expecter
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:134
------------------------------
2018/07/27 02:42:06 read closing down: EOF
2018/07/27 02:42:56 read closing down: EOF
• [SLOW TEST:100.549 seconds]
2018/07/27 02:42:58 read closing down: EOF
Slirp
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
  should be able to
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    VirtualMachineInstance with slirp interface
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/27 02:42:59 read closing down: EOF
••2018/07/27 02:43:49 read closing down: EOF
Service cluster-ip-vmi successfully exposed for virtualmachineinstance testvmilnhpj
------------------------------
• [SLOW TEST:53.421 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:68
      Should expose a Cluster IP service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:71
------------------------------
Service cluster-ip-target-vmi successfully exposed for virtualmachineinstance testvmilnhpj
•Service node-port-vmi successfully exposed for virtualmachineinstance testvmilnhpj
------------------------------
• [SLOW TEST:10.137 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
    Expose NodePort service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:124
      Should expose a NodePort service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:129
------------------------------
2018/07/27 02:44:53 read closing down: EOF
Service cluster-ip-udp-vmi successfully exposed for virtualmachineinstance testvmicxdv5
• [SLOW TEST:54.953 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VMI
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
    Expose ClusterIP UDP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:173
      Should expose a ClusterIP service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:177
------------------------------
Service node-port-udp-vmi successfully exposed for virtualmachineinstance testvmicxdv5
• [SLOW TEST:9.128 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VMI
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
    Expose NodePort UDP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:205
      Should expose a NodePort service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:210
------------------------------
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 06:45:38 http: TLS handshake error from 10.244.0.1:60260: EOF
2018/07/27 06:45:48 http: TLS handshake error from 10.244.0.1:60284: EOF
level=info timestamp=2018-07-27T06:45:56.166113Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:45:57.849938Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:45:58 http: TLS handshake error from 10.244.0.1:60308: EOF
2018/07/27 06:46:08 http: TLS handshake error from 10.244.0.1:60332: EOF
2018/07/27 06:46:18 http: TLS handshake error from 10.244.0.1:60356: EOF
level=info timestamp=2018-07-27T06:46:26.202384Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:46:27.872239Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:46:28 http: TLS handshake error from 10.244.0.1:60380: EOF
2018/07/27 06:46:38 http: TLS handshake error from 10.244.0.1:60404: EOF
2018/07/27 06:46:48 http: TLS handshake error from 10.244.0.1:60428: EOF
level=info timestamp=2018-07-27T06:46:56.248417Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:46:57.906025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:46:58 http: TLS handshake error from 10.244.0.1:60452: EOF
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 06:45:19 http: TLS handshake error from 10.244.1.1:47254: EOF
2018/07/27 06:45:29 http: TLS handshake error from 10.244.1.1:47260: EOF
level=info timestamp=2018-07-27T06:45:34.948534Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:45:39 http: TLS handshake error from 10.244.1.1:47266: EOF
2018/07/27 06:45:49 http: TLS handshake error from 10.244.1.1:47272: EOF
2018/07/27 06:45:59 http: TLS handshake error from 10.244.1.1:47278: EOF
level=info timestamp=2018-07-27T06:46:04.942981Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:46:09 http: TLS handshake error from 10.244.1.1:47290: EOF
2018/07/27 06:46:19 http: TLS handshake error from 10.244.1.1:47296: EOF
2018/07/27 06:46:29 http: TLS handshake error from 10.244.1.1:47302: EOF
level=info timestamp=2018-07-27T06:46:34.941800Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 06:46:39 http: TLS handshake error from 10.244.1.1:47308: EOF
2018/07/27 06:46:49 http: TLS handshake error from 10.244.1.1:47314: EOF
2018/07/27 06:46:59 http: TLS handshake error from 10.244.1.1:47320: EOF
level=info timestamp=2018-07-27T06:47:04.956235Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T06:43:00.599924Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilnhpj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilnhpj"
level=info timestamp=2018-07-27T06:44:04.113742Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:44:04.113850Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:45:08.400159Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4ntsphlqc kind= uid=9a2ab1e9-9168-11e8-86a4-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-07-27T06:45:08.400329Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4ntsphlqc kind= uid=9a2ab1e9-9168-11e8-86a4-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmiv4ntsphlqc"
level=info timestamp=2018-07-27T06:45:08.400524Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4ntsphlqc kind= uid=9a2ab1e9-9168-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:45:08.400576Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4ntsphlqc kind= uid=9a2ab1e9-9168-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:45:08.417171Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-07-27T06:45:08.417249Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmiv4nts29r5c"
level=info timestamp=2018-07-27T06:45:08.417346Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:45:08.418400Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=error timestamp=2018-07-27T06:45:08.468447Z pos=replicaset.go:230 component=virt-controller service=http namespace=kubevirt-test-default name=replicasetkwc4n kind= uid=9a0cc7f4-9168-11e8-86a4-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetkwc4n\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the replicaset status failed."
level=info timestamp=2018-07-27T06:45:08.468515Z pos=replicaset.go:137 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetkwc4n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstanceReplicaSet kubevirt-test-default/replicasetkwc4n"
level=error timestamp=2018-07-27T06:45:08.594313Z pos=replicaset.go:230 component=virt-controller service=http namespace=kubevirt-test-default name=replicasetkwc4n kind= uid=9a0cc7f4-9168-11e8-86a4-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetkwc4n\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the replicaset status failed."
level=info timestamp=2018-07-27T06:45:08.594413Z pos=replicaset.go:137 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetkwc4n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstanceReplicaSet kubevirt-test-default/replicasetkwc4n"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T06:44:19.515373Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:44:19.525465Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:44:19.525520Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:44:19.529223Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:45:22.848018Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:45:23.553831Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-27T06:45:23.554257Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind=Domain uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-27T06:45:23.840104Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:45:23.841215Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind=Domain uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-27T06:45:23.891325Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:45:23.892983Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:45:23.893187Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:45:23.910720Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:45:23.910790Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:45:23.915558Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: netcat26k95
Pod phase: Succeeded
++ head -n 1
+++ nc -ul 28016
+++ echo
+++ nc -up 28016 10.102.10.201 28017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcat2ng9d
Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.101 30017 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcat45l4p
Pod phase: Succeeded
++ head -n 1
+++ nc -ul 31016
+++ echo
+++ nc -up 31016 192.168.66.101 31017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatg4d6q
Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.102 30017 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatjgmfc
Pod phase: Succeeded
++ head -n 1
+++ nc 10.102.76.22 27017 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatn6zjr
Pod phase: Succeeded
++ head -n 1
+++ nc -ul 31016
+++ echo
+++ nc -up 31016 192.168.66.102 31017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatqrc6z
Pod phase: Succeeded
++ head -n 1
+++ nc -ul 29016
+++ echo
+++ nc -up 29016 10.111.124.54 29017 -i 1 -w 1
Hello UDP World!
succeeded
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Pod name: virt-launcher-testvmicxdv5-tlv6b
Pod phase: Running
level=info timestamp=2018-07-27T06:44:18.702917Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:44:19.241547Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:44:19.247906Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:44:19.249598Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 8f3954a5-ef57-407d-a991-b82a91e5a881"
level=info timestamp=2018-07-27T06:44:19.250940Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:44:19.459706Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:44:19.475956Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:44:19.477731Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:44:19.483957Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:44:19.506973Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:44:19.509708Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:44:19.510440Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:44:19.514044Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:44:19.528488Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicxdv5 kind= uid=73d9006c-9168-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:44:20.254239Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 8f3954a5-ef57-407d-a991-b82a91e5a881: 179"
Pod name: virt-launcher-testvmilnhpj-jt2rf
Pod phase: Running
level=info timestamp=2018-07-27T06:43:15.288445Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:43:15.762722Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:43:15.772103Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:43:15.777881Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 42129890-1dd1-4936-8bbd-09cf61b166a7"
level=info timestamp=2018-07-27T06:43:15.778079Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:43:16.036065Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:43:16.058290Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:43:16.060343Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:43:16.071017Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:43:16.090495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmilnhpj kind= uid=4df1a4b6-9168-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:43:16.092903Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmilnhpj kind= uid=4df1a4b6-9168-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:43:16.099171Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:43:16.100708Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:43:16.131066Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmilnhpj kind= uid=4df1a4b6-9168-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:43:16.781402Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 42129890-1dd1-4936-8bbd-09cf61b166a7: 180"
Pod name: virt-launcher-testvmingh7p-s7r98
Pod phase: Pending
level=info timestamp=2018-07-27T06:43:05.774926Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:43:05.775696Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:43:05.777239Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:43:15.785193Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:43:15.835526Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmingh7p"
level=info timestamp=2018-07-27T06:43:15.838234Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:43:15.838419Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiv4nts29r5c-6f2jw
Pod phase: Running
level=info timestamp=2018-07-27T06:45:23.048657Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:45:23.547243Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:45:23.554583Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:45:23.689112Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID d18f904d-9b07-410a-8e46-48194ec30ea7"
level=info timestamp=2018-07-27T06:45:23.689425Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:45:23.811750Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:45:23.836285Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:45:23.840715Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:45:23.842480Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:45:23.886296Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:45:23.889348Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:45:23.890027Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:45:23.894981Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:45:23.915384Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiv4nts29r5c kind= uid=9a2b6738-9168-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:45:24.695273Z pos=monitor.go:222 component=virt-launcher msg="Found PID for d18f904d-9b07-410a-8e46-48194ec30ea7: 179" Pod name: virt-launcher-testvmiv4ntsphlqc-sbzwx Pod phase: Running level=info timestamp=2018-07-27T06:45:13.393540Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T06:45:13.393839Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T06:45:13.397193Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T06:45:23.405512Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T06:45:23.427016Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiv4ntsphlqc" level=info timestamp=2018-07-27T06:45:23.429839Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:45:23.430002Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [122.621 seconds] Expose 
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VMI replica set
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:253
    Expose ClusterIP service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:286
      Should create a ClusterIP service on VMRS and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:290

      Timed out after 120.000s.
      Expected
          : 1
      to equal
          : 2

      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:271
------------------------------
STEP: Creating a VMRS object with 2 replicas
STEP: Start the replica set
STEP: Checking the number of ready replicas
Service cluster-ip-vm successfully exposed for virtualmachine testvmi4pknb
VM testvmi4pknb was scheduled to start
2018/07/27 02:48:04 read closing down: EOF
• [SLOW TEST:58.374 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on an VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:318
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:362
      Connect to ClusterIP services that was set when VM was offline
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:363
------------------------------
2018/07/27 02:49:50 read closing down: EOF
2018/07/27 02:50:00 read closing down: EOF
2018/07/27 02:50:10 read closing down: EOF
2018/07/27 02:50:21 read closing down: EOF
2018/07/27 02:50:21 read closing down: EOF
2018/07/27 02:50:22 read closing down: EOF
• [SLOW TEST:134.745 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/27 02:50:23 read closing down: EOF
2018/07/27 02:50:23 read closing down: EOF
2018/07/27 02:50:25 read closing down: EOF
2018/07/27 02:50:25 read closing down: EOF
•2018/07/27 02:50:25 read closing down: EOF
2018/07/27 02:50:27 read closing down: EOF
•2018/07/27 02:50:27 read closing down: EOF
2018/07/27 02:50:27 read closing down: EOF
2018/07/27 02:50:29 read closing down: EOF
•2018/07/27 02:50:29 read closing down: EOF
2018/07/27 02:50:29 read closing down: EOF
••••
------------------------------
• [SLOW TEST:5.158 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a service matching the vmi exposed
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:283
    should be able to reach the vmi based on labels specified on the vmi
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:303
------------------------------
•
------------------------------
• [SLOW TEST:6.102 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a subdomain and a headless service given
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:330
    should be able to reach the vmi via its unique fully qualified domain name
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:353
------------------------------
Pod name: disks-images-provider-tbdb6 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6 Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g Pod phase: Running
level=info timestamp=2018-07-27T06:52:28.197830Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:52:28 http: TLS handshake error from 10.244.0.1:33162: EOF
2018/07/27 06:52:38 http: TLS handshake error from 10.244.0.1:33186: EOF
2018/07/27 06:52:48 http: TLS handshake error from 10.244.0.1:33210: EOF
level=info timestamp=2018-07-27T06:52:56.658215Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T06:52:58.230963Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:52:58 http: TLS handshake error from 10.244.0.1:33234: EOF 2018/07/27 06:53:08 http: TLS handshake error from 10.244.0.1:33258: EOF 2018/07/27 06:53:18 http: TLS handshake error from 10.244.0.1:33282: EOF level=info timestamp=2018-07-27T06:53:26.715606Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T06:53:28.263918Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:53:28 http: TLS handshake error from 10.244.0.1:33306: EOF 2018/07/27 06:53:38 http: TLS handshake error from 10.244.0.1:33330: EOF 2018/07/27 06:53:48 http: TLS handshake error from 10.244.0.1:33354: EOF level=info timestamp=2018-07-27T06:53:56.741342Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-zplcq Pod phase: Running level=info timestamp=2018-07-27T06:52:04.944818Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:52:09 http: TLS handshake error from 10.244.1.1:47530: EOF 2018/07/27 06:52:19 http: TLS handshake error from 10.244.1.1:47536: EOF 2018/07/27 06:52:29 http: TLS handshake error from 10.244.1.1:47542: EOF level=info timestamp=2018-07-27T06:52:34.950017Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 
username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:52:39 http: TLS handshake error from 10.244.1.1:47548: EOF 2018/07/27 06:52:49 http: TLS handshake error from 10.244.1.1:47554: EOF 2018/07/27 06:52:59 http: TLS handshake error from 10.244.1.1:47560: EOF level=info timestamp=2018-07-27T06:53:04.959505Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:53:09 http: TLS handshake error from 10.244.1.1:47566: EOF 2018/07/27 06:53:19 http: TLS handshake error from 10.244.1.1:47572: EOF 2018/07/27 06:53:29 http: TLS handshake error from 10.244.1.1:47578: EOF level=info timestamp=2018-07-27T06:53:34.954797Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:53:39 http: TLS handshake error from 10.244.1.1:47584: EOF 2018/07/27 06:53:49 http: TLS handshake error from 10.244.1.1:47590: EOF Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T06:48:10.434533Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.434646Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.446882Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.447014Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicbkcn kind= 
uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.483285Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.483406Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.503071Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.503194Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.515965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:10.516487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivstms\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivstms" level=info timestamp=2018-07-27T06:48:10.641069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on 
virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:11.041550Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:48:11.842571Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:50:57.442588Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:50:57.447469Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T06:48:45.137374Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.318754Z pos=vm.go:413 
component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:45.318893Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.406317Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:45.505474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:47.153589Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED" level=info timestamp=2018-07-27T06:48:47.439796Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp" level=info timestamp=2018-07-27T06:48:47.947138Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-27T06:48:47.969896Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-27T06:48:47.977066Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T06:48:47.977164Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:48:47.978573Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:48.001611Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:48.001725Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:48.117044Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6 Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: netcat4rh5k Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcat777s7 Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcat8mqbg Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatc56qq Pod phase: Succeeded
++ head -n 1
+++ nc my-subdomain.myvmi.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatfrghk Pod phase: Failed
++ head -n 1
+++ nc wrongservice.kubevirt-test-default 1500 -i 1 -w 1
Ncat: Could not resolve hostname "wrongservice.kubevirt-test-default": Name or service not known. QUITTING.
+ x=
+ echo ''
+ '[' '' = 'Hello World!' ']'
+ echo failed
+ exit 1
failed
Pod name: netcatth4dw Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatztf2x Pod phase: Succeeded
++ head -n 1
+++ nc myservice.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded Pod name: virt-launcher-testvmicbkcn-rl25n Pod phase: Running level=info timestamp=2018-07-27T06:48:36.062275Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:36.110592Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 80b82610-4496-4a92-8049-072fe071746d" level=info timestamp=2018-07-27T06:48:36.111109Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:36.129980Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:36.666130Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.793616Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:36.794495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:36.795703Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.115178Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 80b82610-4496-4a92-8049-072fe071746d: 189" level=info timestamp=2018-07-27T06:48:37.159328Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.159485Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.192200Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.200016Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.111329Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:45.152257Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmijm75h-2267s Pod phase: Running level=info timestamp=2018-07-27T06:48:35.006275Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.046786Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:36.052825Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 7f2def96-be25-4e50-a04e-3aa88e1165b4" level=info timestamp=2018-07-27T06:48:36.053057Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:36.420360Z pos=client.go:145 component=virt-launcher msg="processed event" level=info 
timestamp=2018-07-27T06:48:36.843101Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.935337Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:36.957422Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Domain started." level=info timestamp=2018-07-27T06:48:36.970203Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.063080Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7f2def96-be25-4e50-a04e-3aa88e1165b4: 185" level=info timestamp=2018-07-27T06:48:37.149246Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.149364Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.185553Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.209800Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.125581Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmin6shn-szs6g Pod phase: Running level=info timestamp=2018-07-27T06:51:01.983525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T06:51:01.983869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T06:51:01.986354Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to 
libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T06:51:11.992400Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T06:51:12.014061Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin6shn" level=info timestamp=2018-07-27T06:51:12.015391Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:51:12.015533Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmirsm5m-9zqql Pod phase: Running level=info timestamp=2018-07-27T06:48:45.599961Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-27T06:48:46.754023Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:46.785270Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID f2662d6c-9524-49ec-8b56-913691ef675d" level=info timestamp=2018-07-27T06:48:46.785520Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:47.392485Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.469585Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.510423Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:47.547092Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:47.548248Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:47.791739Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f2662d6c-9524-49ec-8b56-913691ef675d: 201" level=info timestamp=2018-07-27T06:48:47.947695Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.947868Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.966450Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:47.981992Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:48.008006Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmivstms-btzm8 Pod phase: Running level=info timestamp=2018-07-27T06:48:37.215221Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:37.232621Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 85dc0d94-236b-4aac-b723-1cc2490b8608" level=info timestamp=2018-07-27T06:48:37.232868Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:37.399073Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.618701Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.707556Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.729116Z pos=manager.go:196 component=virt-launcher 
namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Domain started." level=info timestamp=2018-07-27T06:48:37.739525Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.944857Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.945026Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.980142Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:38.243446Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 85dc0d94-236b-4aac-b723-1cc2490b8608: 187" level=info timestamp=2018-07-27T06:48:39.650945Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.282051Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:45.323623Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" • Failure [182.449 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 VirtualMachineInstance with custom interface model /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:379 should expose the right device type to the guest [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:380 Timed out after 90.005s. 
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true

/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: checking the device vendor in /sys/class
level=info timestamp=2018-07-27T06:50:57.480979Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmin6shn-szs6g"
2018/07/27 02:54:00 read closing down: EOF
2018/07/27 02:54:00 read closing down: EOF
•
Pod name: disks-images-provider-tbdb6 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6 Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g Pod phase: Running
2018/07/27 06:55:38 http: TLS handshake error from 10.244.0.1:33632: EOF
2018/07/27 06:55:48 http: TLS handshake error from 10.244.0.1:33656: EOF
level=info timestamp=2018-07-27T06:55:56.868727Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:55:58.421520Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:55:58 http: TLS handshake error from 10.244.0.1:33680: EOF
2018/07/27 06:56:08 http: TLS handshake error from 10.244.0.1:33704: EOF
2018/07/27 06:56:18 http: TLS handshake error from 10.244.0.1:33728: EOF
level=info timestamp=2018-07-27T06:56:26.898825Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T06:56:28.448368Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 06:56:28 http: TLS
handshake error from 10.244.0.1:33752: EOF 2018/07/27 06:56:38 http: TLS handshake error from 10.244.0.1:33776: EOF 2018/07/27 06:56:48 http: TLS handshake error from 10.244.0.1:33800: EOF level=info timestamp=2018-07-27T06:56:56.927839Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T06:56:58.487183Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:56:58 http: TLS handshake error from 10.244.0.1:33824: EOF Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 06:55:09 http: TLS handshake error from 10.244.1.1:47640: EOF 2018/07/27 06:55:19 http: TLS handshake error from 10.244.1.1:47646: EOF 2018/07/27 06:55:29 http: TLS handshake error from 10.244.1.1:47652: EOF level=info timestamp=2018-07-27T06:55:34.966915Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:55:39 http: TLS handshake error from 10.244.1.1:47658: EOF 2018/07/27 06:55:49 http: TLS handshake error from 10.244.1.1:47664: EOF 2018/07/27 06:55:59 http: TLS handshake error from 10.244.1.1:47670: EOF level=info timestamp=2018-07-27T06:56:04.961989Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:56:09 http: TLS handshake error from 10.244.1.1:47676: EOF 2018/07/27 06:56:19 http: TLS handshake error from 10.244.1.1:47682: EOF 2018/07/27 06:56:29 http: TLS handshake error from 10.244.1.1:47688: EOF level=info timestamp=2018-07-27T06:56:34.978715Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:56:39 
http: TLS handshake error from 10.244.1.1:47694: EOF 2018/07/27 06:56:49 http: TLS handshake error from 10.244.1.1:47700: EOF 2018/07/27 06:56:59 http: TLS handshake error from 10.244.1.1:47706: EOF Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T06:48:10.446882Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.447014Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.483285Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.483406Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.503071Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.503194Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.515965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please 
apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:10.516487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivstms\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivstms" level=info timestamp=2018-07-27T06:48:10.641069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:11.041550Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:48:11.842571Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:50:57.442588Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:50:57.447469Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:54:00.926951Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:54:00.927560Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T06:48:45.137374Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.318754Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:45.318893Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.406317Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:45.505474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T06:48:47.153589Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED" level=info timestamp=2018-07-27T06:48:47.439796Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp" level=info timestamp=2018-07-27T06:48:47.947138Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-27T06:48:47.969896Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-27T06:48:47.977066Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:47.977164Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="No update processing required" level=info timestamp=2018-07-27T06:48:47.978573Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-27T06:48:48.001611Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:48.001725Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:48.117044Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
Pod name: virt-handler-n5fk6 Pod phase: Running level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: netcat4rh5k Pod phase: Succeeded ++ head -n 1 +++ nc 10.244.1.40 1500 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded Pod name: netcat777s7 Pod phase: Succeeded ++ head -n 1 +++ nc 10.244.1.40 1500 -i 1 -w 1 Hello World! succeeded + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Pod name: netcat8mqbg Pod phase: Succeeded ++ head -n 1 +++ nc 10.244.1.40 1500 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded Pod name: netcatc56qq Pod phase: Succeeded ++ head -n 1 +++ nc my-subdomain.myvmi.kubevirt-test-default 1500 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded Pod name: netcatfrghk Pod phase: Failed ++ head -n 1 +++ nc wrongservice.kubevirt-test-default 1500 -i 1 -w 1 Ncat: Could not resolve hostname "wrongservice.kubevirt-test-default": Name or service not known. QUITTING. + x= + echo '' + '[' '' = 'Hello World!' 
']' + echo failed + exit 1 failed Pod name: netcatth4dw Pod phase: Succeeded ++ head -n 1 +++ nc 10.244.1.40 1500 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded Pod name: netcatztf2x Pod phase: Succeeded ++ head -n 1 +++ nc myservice.kubevirt-test-default 1500 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded Pod name: virt-launcher-testvmicbkcn-rl25n Pod phase: Running level=info timestamp=2018-07-27T06:48:36.062275Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:36.110592Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 80b82610-4496-4a92-8049-072fe071746d" level=info timestamp=2018-07-27T06:48:36.111109Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:36.129980Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:36.666130Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.793616Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:36.794495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:36.795703Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.115178Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 80b82610-4496-4a92-8049-072fe071746d: 189" level=info timestamp=2018-07-27T06:48:37.159328Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.159485Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.192200Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.200016Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.111329Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:45.152257Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmijm75h-2267s Pod phase: Running level=info timestamp=2018-07-27T06:48:35.006275Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.046786Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:36.052825Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 7f2def96-be25-4e50-a04e-3aa88e1165b4" level=info timestamp=2018-07-27T06:48:36.053057Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:36.420360Z pos=client.go:145 component=virt-launcher msg="processed event" level=info 
timestamp=2018-07-27T06:48:36.843101Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:36.935337Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:36.957422Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Domain started." level=info timestamp=2018-07-27T06:48:36.970203Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.063080Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7f2def96-be25-4e50-a04e-3aa88e1165b4: 185" level=info timestamp=2018-07-27T06:48:37.149246Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.149364Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.185553Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.209800Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.125581Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmin6shn-szs6g Pod phase: Failed level=info timestamp=2018-07-27T06:51:01.983525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T06:51:01.983869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T06:51:01.986354Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to 
libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T06:51:11.992400Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T06:51:12.014061Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin6shn" level=info timestamp=2018-07-27T06:51:12.015391Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:51:12.015533Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42048e0f0, 0xc4202081c0, 0xc42061f7a0) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmirsm5m-9zqql Pod phase: Running level=info timestamp=2018-07-27T06:48:45.599961Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-27T06:48:46.754023Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:46.785270Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID f2662d6c-9524-49ec-8b56-913691ef675d" level=info timestamp=2018-07-27T06:48:46.785520Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:47.392485Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.469585Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.510423Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info 
timestamp=2018-07-27T06:48:47.547092Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain started." level=info timestamp=2018-07-27T06:48:47.548248Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:47.791739Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f2662d6c-9524-49ec-8b56-913691ef675d: 201" level=info timestamp=2018-07-27T06:48:47.947695Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.947868Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.966450Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:47.981992Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:48.008006Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmithtwm-wjdbr Pod phase: Running level=info timestamp=2018-07-27T06:54:04.996674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T06:54:04.998732Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T06:54:05.008528Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T06:54:15.015983Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T06:54:15.034974Z pos=virt-launcher.go:143 component=virt-launcher 
msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmithtwm" level=info timestamp=2018-07-27T06:54:15.036479Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:54:15.036611Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmivstms-btzm8 Pod phase: Running level=info timestamp=2018-07-27T06:48:37.215221Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:37.232621Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 85dc0d94-236b-4aac-b723-1cc2490b8608" level=info timestamp=2018-07-27T06:48:37.232868Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:37.399073Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.618701Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.707556Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.729116Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:37.739525Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:37.944857Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.945026Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.980142Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:38.243446Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 85dc0d94-236b-4aac-b723-1cc2490b8608: 187" level=info timestamp=2018-07-27T06:48:39.650945Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:45.282051Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:45.323623Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi" ------------------------------ • Failure [182.655 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 VirtualMachineInstance with custom MAC address /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:413 should configure custom MAC address [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:414 Timed out after 90.004s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049 ------------------------------ STEP: checking eth0 MAC address level=info timestamp=2018-07-27T06:54:01.050686Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmithtwm-wjdbr" Pod name: disks-images-provider-tbdb6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zk9q6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-zn76g Pod phase: Running level=info timestamp=2018-07-27T06:58:57.046174Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T06:58:58.654270Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:58:58 http: TLS handshake error from 10.244.0.1:34112: EOF 2018/07/27 06:59:08 http: TLS handshake error from 10.244.0.1:34136: EOF level=info timestamp=2018-07-27T06:59:10.340101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T06:59:10.340968Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:59:18 http: TLS handshake error from 10.244.0.1:34160: EOF level=info timestamp=2018-07-27T06:59:27.075997Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T06:59:28.680300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 
username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:59:28 http: TLS handshake error from 10.244.0.1:34184: EOF 2018/07/27 06:59:38 http: TLS handshake error from 10.244.0.1:34208: EOF 2018/07/27 06:59:48 http: TLS handshake error from 10.244.0.1:34232: EOF level=info timestamp=2018-07-27T06:59:57.108621Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 06:59:58 http: TLS handshake error from 10.244.0.1:34256: EOF level=info timestamp=2018-07-27T06:59:58.708205Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 06:58:29 http: TLS handshake error from 10.244.1.1:47760: EOF level=info timestamp=2018-07-27T06:58:34.949826Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:58:39 http: TLS handshake error from 10.244.1.1:47766: EOF 2018/07/27 06:58:49 http: TLS handshake error from 10.244.1.1:47772: EOF 2018/07/27 06:58:59 http: TLS handshake error from 10.244.1.1:47778: EOF level=info timestamp=2018-07-27T06:59:04.862302Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T06:59:04.887511Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T06:59:04.965715Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:59:09 http: TLS handshake error from 
10.244.1.1:47784: EOF 2018/07/27 06:59:19 http: TLS handshake error from 10.244.1.1:47790: EOF 2018/07/27 06:59:29 http: TLS handshake error from 10.244.1.1:47796: EOF level=info timestamp=2018-07-27T06:59:34.960007Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 06:59:39 http: TLS handshake error from 10.244.1.1:47802: EOF 2018/07/27 06:59:49 http: TLS handshake error from 10.244.1.1:47808: EOF 2018/07/27 06:59:59 http: TLS handshake error from 10.244.1.1:47814: EOF Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T06:48:10.483406Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.503071Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:48:10.503194Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:48:10.515965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:10.516487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivstms\": the object has been modified; please apply 
your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivstms" level=info timestamp=2018-07-27T06:48:10.641069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn" level=info timestamp=2018-07-27T06:48:11.041550Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:48:11.842571Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h" level=info timestamp=2018-07-27T06:50:57.442588Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:50:57.447469Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:54:00.926951Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info 
timestamp=2018-07-27T06:54:00.927560Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:57:03.579219Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T06:57:03.583071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T06:57:03.844620Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmib9dg5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmib9dg5" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T06:48:45.137374Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.318754Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T06:48:45.318893Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T06:48:45.406317Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:45.505474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T06:48:47.153589Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED" level=info timestamp=2018-07-27T06:48:47.439796Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp" level=info timestamp=2018-07-27T06:48:47.947138Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-27T06:48:47.969896Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-27T06:48:47.977066Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T06:48:47.977164Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:48:47.978573Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:48.001611Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:48.001725Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:48.117044Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: netcat4rh5k
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcat777s7
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcat8mqbg
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatc56qq
Pod phase: Succeeded
++ head -n 1
+++ nc my-subdomain.myvmi.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatfrghk
Pod phase: Failed
++ head -n 1
+++ nc wrongservice.kubevirt-test-default 1500 -i 1 -w 1
Ncat: Could not resolve hostname "wrongservice.kubevirt-test-default": Name or service not known. QUITTING.
+ x=
+ echo ''
+ '[' '' = 'Hello World!' ']'
+ echo failed
+ exit 1
failed
Pod name: netcatth4dw
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatztf2x
Pod phase: Succeeded
++ head -n 1
+++ nc myservice.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: virt-launcher-testvmib9dg5-j79t4
Pod phase: Running
level=info timestamp=2018-07-27T06:57:10.808161Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:57:10.808627Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:57:10.810424Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:57:20.822946Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:57:20.865795Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmib9dg5"
level=info timestamp=2018-07-27T06:57:20.867626Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:57:20.867914Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmicbkcn-rl25n
Pod phase: Running
level=info timestamp=2018-07-27T06:48:36.062275Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.110592Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 80b82610-4496-4a92-8049-072fe071746d"
level=info timestamp=2018-07-27T06:48:36.111109Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.129980Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.666130Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.793616Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.794495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.795703Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.115178Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 80b82610-4496-4a92-8049-072fe071746d: 189"
level=info timestamp=2018-07-27T06:48:37.159328Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.159485Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.192200Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.200016Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.111329Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.152257Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmijm75h-2267s
Pod phase: Running
level=info timestamp=2018-07-27T06:48:35.006275Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.046786Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.052825Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 7f2def96-be25-4e50-a04e-3aa88e1165b4"
level=info timestamp=2018-07-27T06:48:36.053057Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.420360Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.843101Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.935337Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.957422Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.970203Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.063080Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7f2def96-be25-4e50-a04e-3aa88e1165b4: 185"
level=info timestamp=2018-07-27T06:48:37.149246Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.149364Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.185553Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.209800Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.125581Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmin6shn-szs6g
Pod phase: Failed
level=info timestamp=2018-07-27T06:51:01.983525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:51:01.983869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:51:01.986354Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:51:11.992400Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:51:12.014061Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin6shn"
level=info timestamp=2018-07-27T06:51:12.015391Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:51:12.015533Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42048e0f0, 0xc4202081c0, 0xc42061f7a0)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
Pod name: virt-launcher-testvmirsm5m-9zqql
Pod phase: Running
level=info timestamp=2018-07-27T06:48:45.599961Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:48:46.754023Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:46.785270Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID f2662d6c-9524-49ec-8b56-913691ef675d"
level=info timestamp=2018-07-27T06:48:46.785520Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:47.392485Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:47.469585Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:47.510423Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:47.547092Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:47.548248Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:47.791739Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f2662d6c-9524-49ec-8b56-913691ef675d: 201"
level=info timestamp=2018-07-27T06:48:47.947695Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:47.947868Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:47.966450Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:47.981992Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:48.008006Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmithtwm-wjdbr
Pod phase: Failed
level=info timestamp=2018-07-27T06:54:04.996674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:54:04.998732Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:54:05.008528Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:54:15.015983Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:54:15.034974Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmithtwm"
level=info timestamp=2018-07-27T06:54:15.036479Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:54:15.036611Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc420430550, 0xc420204230, 0xc4205e3980)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
Pod name: virt-launcher-testvmivstms-btzm8
Pod phase: Running
level=info timestamp=2018-07-27T06:48:37.215221Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:37.232621Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 85dc0d94-236b-4aac-b723-1cc2490b8608"
level=info timestamp=2018-07-27T06:48:37.232868Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:37.399073Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.618701Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.707556Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.729116Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:37.739525Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.944857Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.945026Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.980142Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:38.243446Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 85dc0d94-236b-4aac-b723-1cc2490b8608: 187"
level=info timestamp=2018-07-27T06:48:39.650945Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.282051Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.323623Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"

• Failure [182.845 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address in non-conventional format
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:425
    should configure custom MAC address [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:426

    Timed out after 90.004s.
    Timed out waiting for VMI to enter Running phase
    Expected
        : false
    to equal
        : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: checking eth0 MAC address
level=info timestamp=2018-07-27T06:57:03.625055Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmib9dg5-j79t4"
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=info timestamp=2018-07-27T07:01:57.217894Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:01:58 http: TLS handshake error from 10.244.0.1:34544: EOF
level=info timestamp=2018-07-27T07:01:58.810947Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:02:08 http: TLS handshake error from 10.244.0.1:34568: EOF
level=info timestamp=2018-07-27T07:02:10.346054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:02:10.347040Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:02:18 http: TLS handshake error from 10.244.0.1:34592: EOF
level=info timestamp=2018-07-27T07:02:27.249195Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:02:28 http: TLS handshake error from 10.244.0.1:34616: EOF
level=info timestamp=2018-07-27T07:02:28.837095Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:02:38 http: TLS handshake error from 10.244.0.1:34640: EOF
2018/07/27 07:02:48 http: TLS handshake error from 10.244.0.1:34664: EOF
level=info timestamp=2018-07-27T07:02:57.279616Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:02:58 http: TLS handshake error from 10.244.0.1:34688: EOF
level=info timestamp=2018-07-27T07:02:58.870537Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:01:19 http: TLS handshake error from 10.244.1.1:47862: EOF
2018/07/27 07:01:29 http: TLS handshake error from 10.244.1.1:47868: EOF
level=info timestamp=2018-07-27T07:01:34.972614Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:01:39 http: TLS handshake error from 10.244.1.1:47874: EOF
2018/07/27 07:01:49 http: TLS handshake error from 10.244.1.1:47880: EOF
2018/07/27 07:01:59 http: TLS handshake error from 10.244.1.1:47886: EOF
level=info timestamp=2018-07-27T07:02:04.981948Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:02:09 http: TLS handshake error from 10.244.1.1:47892: EOF
2018/07/27 07:02:19 http: TLS handshake error from 10.244.1.1:47898: EOF
2018/07/27 07:02:29 http: TLS handshake error from 10.244.1.1:47904: EOF
level=info timestamp=2018-07-27T07:02:34.952490Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:02:39 http: TLS handshake error from 10.244.1.1:47910: EOF
2018/07/27 07:02:49 http: TLS handshake error from 10.244.1.1:47916: EOF
2018/07/27 07:02:59 http: TLS handshake error from 10.244.1.1:47922: EOF
level=info timestamp=2018-07-27T07:03:04.968060Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T06:48:10.503194Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:48:10.515965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn"
level=info timestamp=2018-07-27T06:48:10.516487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivstms\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivstms"
level=info timestamp=2018-07-27T06:48:10.641069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn"
level=info timestamp=2018-07-27T06:48:11.041550Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h"
level=info timestamp=2018-07-27T06:48:11.842571Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h"
level=info timestamp=2018-07-27T06:50:57.442588Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:50:57.447469Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:54:00.926951Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:54:00.927560Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:57:03.579219Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:57:03.583071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:57:03.844620Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmib9dg5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmib9dg5"
level=info timestamp=2018-07-27T07:00:06.424284Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmif4k2c kind= uid=b16e12b7-916a-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:00:06.425532Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmif4k2c kind= uid=b16e12b7-916a-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T06:48:45.137374Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:45.318754Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:45.318893Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:45.406317Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:45.505474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:47.153589Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-27T06:48:47.439796Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-27T06:48:47.947138Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:47.969896Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-27T06:48:47.977066Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:47.977164Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:48:47.978573Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:48.001611Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:48.001725Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:48.117044Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: netcat4rh5k
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcat777s7
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcat8mqbg
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatc56qq
Pod phase: Succeeded
++ head -n 1
+++ nc my-subdomain.myvmi.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatfrghk
Pod phase: Failed
++ head -n 1
+++ nc wrongservice.kubevirt-test-default 1500 -i 1 -w 1
Ncat: Could not resolve hostname "wrongservice.kubevirt-test-default": Name or service not known. QUITTING.
+ x=
+ echo ''
+ '[' '' = 'Hello World!' ']'
+ echo failed
+ exit 1
failed
Pod name: netcatth4dw
Pod phase: Succeeded
++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatztf2x
Pod phase: Succeeded
++ head -n 1
+++ nc myservice.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: virt-launcher-testvmib9dg5-j79t4
Pod phase: Failed
level=info timestamp=2018-07-27T06:57:10.808161Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:57:10.808627Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:57:10.810424Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:57:20.822946Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:57:20.865795Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmib9dg5"
level=info timestamp=2018-07-27T06:57:20.867626Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:57:20.867914Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42027c0e0, 0xc4201f21c0, 0xc421724900)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
Pod name: virt-launcher-testvmicbkcn-rl25n
Pod phase: Running
level=info timestamp=2018-07-27T06:48:36.062275Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.110592Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 80b82610-4496-4a92-8049-072fe071746d"
level=info timestamp=2018-07-27T06:48:36.111109Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.129980Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.666130Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.793616Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.794495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.795703Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.115178Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 80b82610-4496-4a92-8049-072fe071746d: 189"
level=info timestamp=2018-07-27T06:48:37.159328Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.159485Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.192200Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.200016Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.111329Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.152257Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmif4k2c-p6gvr
Pod phase: Running
level=info timestamp=2018-07-27T07:00:10.124940Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:00:10.125255Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:00:10.127932Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:00:20.137973Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:00:20.160320Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmif4k2c"
level=info timestamp=2018-07-27T07:00:20.162163Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:00:20.162337Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmijm75h-2267s
Pod phase: Running
level=info timestamp=2018-07-27T06:48:35.006275Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.046786Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.052825Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 7f2def96-be25-4e50-a04e-3aa88e1165b4"
level=info timestamp=2018-07-27T06:48:36.053057Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.420360Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.843101Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.935337Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.957422Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.970203Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.063080Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7f2def96-be25-4e50-a04e-3aa88e1165b4: 185"
level=info timestamp=2018-07-27T06:48:37.149246Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.149364Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.185553Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.209800Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.125581Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmin6shn-szs6g
Pod phase: Failed
level=info timestamp=2018-07-27T06:51:01.983525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:51:01.983869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:51:01.986354Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:51:11.992400Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:51:12.014061Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at
/var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin6shn" level=info timestamp=2018-07-27T06:51:12.015391Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:51:12.015533Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42048e0f0, 0xc4202081c0, 0xc42061f7a0) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmirsm5m-9zqql Pod phase: Running level=info timestamp=2018-07-27T06:48:45.599961Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-27T06:48:46.754023Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:46.785270Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID f2662d6c-9524-49ec-8b56-913691ef675d" level=info timestamp=2018-07-27T06:48:46.785520Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:47.392485Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.469585Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.510423Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:47.547092Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:47.548248Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-27T06:48:47.791739Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f2662d6c-9524-49ec-8b56-913691ef675d: 201" level=info timestamp=2018-07-27T06:48:47.947695Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:47.947868Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-27T06:48:47.966450Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:47.981992Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:48.008006Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmithtwm-wjdbr Pod phase: Failed level=info timestamp=2018-07-27T06:54:04.996674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T06:54:04.998732Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T06:54:05.008528Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T06:54:15.015983Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T06:54:15.034974Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmithtwm" level=info timestamp=2018-07-27T06:54:15.036479Z pos=client.go:152 component=virt-launcher 
msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T06:54:15.036611Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc420430550, 0xc420204230, 0xc4205e3980) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmivstms-btzm8 Pod phase: Running level=info timestamp=2018-07-27T06:48:37.215221Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-27T06:48:37.232621Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 85dc0d94-236b-4aac-b723-1cc2490b8608" level=info timestamp=2018-07-27T06:48:37.232868Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-27T06:48:37.399073Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-27T06:48:37.618701Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-27T06:48:37.707556Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-27T06:48:37.729116Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-27T06:48:37.739525Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.944857Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.945026Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.980142Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:38.243446Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 85dc0d94-236b-4aac-b723-1cc2490b8608: 187"
level=info timestamp=2018-07-27T06:48:39.650945Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.282051Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.323623Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"

• Failure [183.049 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address and slirp interface
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:438
    should configure custom MAC address [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:439

    Timed out after 90.005s.
    Timed out waiting for VMI to enter Running phase
    Expected
      : false
    to equal
      : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: checking eth0 MAC address
level=info timestamp=2018-07-27T07:00:06.483278Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmif4k2c-p6gvr"

Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running

2018/07/27 07:04:58 http: TLS handshake error from 10.244.0.1:34976: EOF
level=info timestamp=2018-07-27T07:04:58.980046Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:05:08 http: TLS handshake error from 10.244.0.1:35000: EOF
level=info timestamp=2018-07-27T07:05:10.224072Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:05:10.225615Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:05:18 http: TLS handshake error from 10.244.0.1:35024: EOF
level=info timestamp=2018-07-27T07:05:27.419637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:05:28 http: TLS handshake error from 10.244.0.1:35048: EOF
level=info timestamp=2018-07-27T07:05:29.010987Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:05:38 http: TLS handshake error from 10.244.0.1:35072: EOF
2018/07/27 07:05:48 http: TLS handshake error from 10.244.0.1:35096: EOF
level=info timestamp=2018-07-27T07:05:57.450753Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:05:58 http: TLS handshake error from 10.244.0.1:35120: EOF
level=info timestamp=2018-07-27T07:05:59.039473Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:06:08 http: TLS handshake error from 10.244.0.1:35144: EOF

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running

2018/07/27 07:04:19 http: TLS handshake error from 10.244.1.1:47970: EOF
2018/07/27 07:04:29 http: TLS handshake error from 10.244.1.1:47976: EOF
level=info timestamp=2018-07-27T07:04:34.958034Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:04:39 http: TLS handshake error from 10.244.1.1:47982: EOF
2018/07/27 07:04:49 http: TLS handshake error from 10.244.1.1:47988: EOF
2018/07/27 07:04:59 http: TLS handshake error from 10.244.1.1:47994: EOF
level=info timestamp=2018-07-27T07:05:04.948964Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:05:09 http: TLS handshake error from 10.244.1.1:48000: EOF
2018/07/27 07:05:19 http: TLS handshake error from 10.244.1.1:48006: EOF
2018/07/27 07:05:29 http: TLS handshake error from 10.244.1.1:48012: EOF
level=info timestamp=2018-07-27T07:05:34.963966Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:05:39 http: TLS handshake error from 10.244.1.1:48018: EOF
2018/07/27 07:05:49 http: TLS handshake error from 10.244.1.1:48024: EOF
2018/07/27 07:05:59 http: TLS handshake error from 10.244.1.1:48030: EOF
level=info timestamp=2018-07-27T07:06:04.982191Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running

level=info timestamp=2018-07-27T06:48:10.641069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicbkcn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicbkcn"
level=info timestamp=2018-07-27T06:48:11.041550Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h"
level=info timestamp=2018-07-27T06:48:11.842571Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijm75h\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijm75h"
level=info timestamp=2018-07-27T06:50:57.442588Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:50:57.447469Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin6shn kind= uid=6a3541a9-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:54:00.926951Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:54:00.927560Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmithtwm kind= uid=d7931832-9169-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:57:03.579219Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T06:57:03.583071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmib9dg5 kind= uid=44720e56-916a-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T06:57:03.844620Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmib9dg5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmib9dg5"
level=info timestamp=2018-07-27T07:00:06.424284Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmif4k2c kind= uid=b16e12b7-916a-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:00:06.425532Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmif4k2c kind= uid=b16e12b7-916a-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:03:09.473280Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijslsc kind= uid=1e890bd1-916b-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:03:09.475883Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijslsc kind= uid=1e890bd1-916b-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:03:09.759488Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijslsc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijslsc"

Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running

level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-handler-2qczc
Pod phase: Running

level=info timestamp=2018-07-27T06:48:45.137374Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:45.318754Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:45.318893Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:45.406317Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:45.505474Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:47.153589Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type ADDED"
level=info timestamp=2018-07-27T06:48:47.439796Z pos=vm.go:657 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Paused reason StartingUp"
level=info timestamp=2018-07-27T06:48:47.947138Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:47.969896Z pos=vm.go:688 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind=Domain uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-27T06:48:47.977066Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:47.977164Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T06:48:47.978573Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-27T06:48:48.001611Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T06:48:48.001725Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T06:48:48.117044Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-handler-n5fk6
Pod phase: Running

level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"

Pod name: netcat4rh5k
Pod phase: Succeeded

++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded

Pod name: netcat777s7
Pod phase: Succeeded

++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0

Pod name: netcat8mqbg
Pod phase: Succeeded

++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded

Pod name: netcatc56qq
Pod phase: Succeeded

++ head -n 1
+++ nc my-subdomain.myvmi.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded

Pod name: netcatfrghk
Pod phase: Failed

++ head -n 1
+++ nc wrongservice.kubevirt-test-default 1500 -i 1 -w 1
Ncat: Could not resolve hostname "wrongservice.kubevirt-test-default": Name or service not known. QUITTING.
+ x=
+ echo ''
+ '[' '' = 'Hello World!' ']'
+ echo failed
+ exit 1
failed

Pod name: netcatth4dw
Pod phase: Succeeded

++ head -n 1
+++ nc 10.244.1.40 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded

Pod name: netcatztf2x
Pod phase: Succeeded

++ head -n 1
+++ nc myservice.kubevirt-test-default 1500 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded

Pod name: virt-launcher-testvmib9dg5-j79t4
Pod phase: Failed

level=info timestamp=2018-07-27T06:57:10.808161Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:57:10.808627Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:57:10.810424Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:57:20.822946Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:57:20.865795Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmib9dg5"
level=info timestamp=2018-07-27T06:57:20.867626Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:57:20.867914Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42027c0e0, 0xc4201f21c0, 0xc421724900)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

Pod name: virt-launcher-testvmicbkcn-rl25n
Pod phase: Running

level=info timestamp=2018-07-27T06:48:36.062275Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.110592Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 80b82610-4496-4a92-8049-072fe071746d"
level=info timestamp=2018-07-27T06:48:36.111109Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.129980Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.666130Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.793616Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.794495Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.795703Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.115178Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 80b82610-4496-4a92-8049-072fe071746d: 189"
level=info timestamp=2018-07-27T06:48:37.159328Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.159485Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.192200Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.200016Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.111329Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.152257Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmicbkcn kind= uid=06ac70b3-9169-11e8-86a4-525500d15501 msg="Synced vmi"

Pod name: virt-launcher-testvmif4k2c-p6gvr
Pod phase: Failed

level=info timestamp=2018-07-27T07:00:10.124940Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:00:10.125255Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:00:10.127932Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:00:20.137973Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:00:20.160320Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmif4k2c"
level=info timestamp=2018-07-27T07:00:20.162163Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:00:20.162337Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc4201fa0f0, 0xc4201fc0e0, 0xc42064d740)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

Pod name: virt-launcher-testvmijm75h-2267s
Pod phase: Running

level=info timestamp=2018-07-27T06:48:35.006275Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.046786Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:36.052825Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 7f2def96-be25-4e50-a04e-3aa88e1165b4"
level=info timestamp=2018-07-27T06:48:36.053057Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:36.420360Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:36.843101Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:36.935337Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:36.957422Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:36.970203Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.063080Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 7f2def96-be25-4e50-a04e-3aa88e1165b4: 185"
level=info timestamp=2018-07-27T06:48:37.149246Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.149364Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.185553Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.209800Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.125581Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmijm75h kind= uid=06ae0d12-9169-11e8-86a4-525500d15501 msg="Synced vmi"

Pod name: virt-launcher-testvmijslsc-fpfwn
Pod phase: Running

level=info timestamp=2018-07-27T07:03:12.759304Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:03:12.760775Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:03:12.762902Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:03:22.772032Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:03:22.799306Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmijslsc"
level=info timestamp=2018-07-27T07:03:22.801388Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:03:22.801631Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmin6shn-szs6g
Pod phase: Failed

level=info timestamp=2018-07-27T06:51:01.983525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:51:01.983869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:51:01.986354Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:51:11.992400Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:51:12.014061Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin6shn"
level=info timestamp=2018-07-27T06:51:12.015391Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:51:12.015533Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc42048e0f0, 0xc4202081c0, 0xc42061f7a0)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

Pod name: virt-launcher-testvmirsm5m-9zqql
Pod phase: Running

level=info timestamp=2018-07-27T06:48:45.599961Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-27T06:48:46.754023Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:46.785270Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID f2662d6c-9524-49ec-8b56-913691ef675d"
level=info timestamp=2018-07-27T06:48:46.785520Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:47.392485Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:47.469585Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:47.510423Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:47.547092Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:47.548248Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:47.791739Z pos=monitor.go:222 component=virt-launcher msg="Found PID for f2662d6c-9524-49ec-8b56-913691ef675d: 201"
level=info timestamp=2018-07-27T06:48:47.947695Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:47.947868Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:47.966450Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:47.981992Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:48.008006Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirsm5m kind= uid=06b23b09-9169-11e8-86a4-525500d15501 msg="Synced vmi"

Pod name: virt-launcher-testvmithtwm-wjdbr
Pod phase: Failed
level=info timestamp=2018-07-27T06:54:04.996674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T06:54:04.998732Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T06:54:05.008528Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T06:54:15.015983Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T06:54:15.034974Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmithtwm"
level=info timestamp=2018-07-27T06:54:15.036479Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T06:54:15.036611Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1443c60, 0xc420430550, 0xc420204230, 0xc4205e3980)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

Pod name: virt-launcher-testvmivstms-btzm8
Pod phase: Running
level=info timestamp=2018-07-27T06:48:37.215221Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-27T06:48:37.232621Z pos=virt-launcher.go:214 component=virt-launcher msg="Detected domain with UUID 85dc0d94-236b-4aac-b723-1cc2490b8608"
level=info timestamp=2018-07-27T06:48:37.232868Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-27T06:48:37.399073Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.618701Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.707556Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:37.729116Z pos=manager.go:196 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Domain started."
level=info timestamp=2018-07-27T06:48:37.739525Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:37.944857Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:37.945026Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-27T06:48:37.980142Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-27T06:48:38.243446Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 85dc0d94-236b-4aac-b723-1cc2490b8608: 187"
level=info timestamp=2018-07-27T06:48:39.650945Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-27T06:48:45.282051Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-27T06:48:45.323623Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmivstms kind= uid=06aa9b51-9169-11e8-86a4-525500d15501 msg="Synced vmi"

• Failure [183.242 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with disabled automatic attachment of interfaces
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:451
    should not configure any external interfaces [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:452

    Timed out after 90.004s.
    Timed out waiting for VMI to enter Running phase
    Expected
        : false
    to equal
        : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: checking loopback is the only guest interface
level=info timestamp=2018-07-27T07:03:09.599022Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmijslsc-fpfwn"
• [SLOW TEST:17.772 seconds]
VNC
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:54
    with VNC connection
    /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:62
      should allow accessing the VNC device
      /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:64
------------------------------
••
2018/07/27 03:07:19 read closing down: EOF
------------------------------
• [SLOW TEST:48.923 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with Disk PVC
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/27 03:08:08 read closing down: EOF
• [SLOW TEST:49.427 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with CDRom PVC
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/27 03:10:19 read closing down: EOF
• [SLOW TEST:134.464 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started and stopped multiple times
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with Disk PVC
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/27 03:12:31 read closing down: EOF
• [SLOW TEST:130.977 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    with Alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71
      should be successfully started and stopped multiple times
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        with CDRom PVC
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:50.054 seconds]
2018/07/27 03:13:24 read closing down: EOF
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With an emptyDisk defined
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:113
      should create a writeable emptyDisk with the right capacity
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:115
------------------------------
• [SLOW TEST:51.467 seconds]
2018/07/27 03:14:15 read closing down: EOF
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With an emptyDisk defined and a specified serial number
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:163
      should create a writeable emptyDisk with the specified serial number
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:165
------------------------------
2018/07/27 03:15:04 read closing down: EOF
• [SLOW TEST:48.959 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With ephemeral alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
      should be successfully started
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:207
------------------------------
2018/07/27 03:16:50 read closing down: EOF
• [SLOW TEST:105.466 seconds]
Storage
2018/07/27 03:16:50 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With ephemeral alpine PVC
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205
      should not persist data
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:218
------------------------------
2018/07/27 03:19:01 read closing down: EOF
• [SLOW TEST:131.636 seconds]
Storage
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46
  Starting a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70
    With VirtualMachineInstance with two PVCs
    /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:266
      should start vmi multiple times
      /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:278
------------------------------
•
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 07:20:38 http: TLS handshake error from 10.244.0.1:37326: EOF
2018/07/27 07:20:48 http: TLS handshake error from 10.244.0.1:37350: EOF
level=info timestamp=2018-07-27T07:20:58.335723Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:20:58 http: TLS handshake error from 10.244.0.1:37374: EOF
level=info timestamp=2018-07-27T07:20:59.851180Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:21:08 http: TLS handshake error from 10.244.0.1:37398: EOF
2018/07/27 07:21:18 http: TLS handshake error from 10.244.0.1:37422: EOF
level=info timestamp=2018-07-27T07:21:28.373271Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:21:28 http: TLS handshake error from 10.244.0.1:37446: EOF
level=info timestamp=2018-07-27T07:21:29.878255Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:21:38 http: TLS handshake error from 10.244.0.1:37470: EOF
2018/07/27 07:21:48 http: TLS handshake error from 10.244.0.1:37494: EOF
level=info timestamp=2018-07-27T07:21:58.400721Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:21:58 http: TLS handshake error from 10.244.0.1:37518: EOF
level=info timestamp=2018-07-27T07:21:59.911246Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:20:09 http: TLS handshake error from 10.244.1.1:48550: EOF
2018/07/27 07:20:19 http: TLS handshake error from 10.244.1.1:48556: EOF
2018/07/27 07:20:29 http: TLS handshake error from 10.244.1.1:48562: EOF
level=info timestamp=2018-07-27T07:20:34.961066Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:20:39 http: TLS handshake error from 10.244.1.1:48568: EOF
2018/07/27 07:20:49 http: TLS handshake error from 10.244.1.1:48574: EOF
2018/07/27 07:20:59 http: TLS handshake error from 10.244.1.1:48580: EOF
level=info timestamp=2018-07-27T07:21:04.974975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:21:09 http: TLS handshake error from 10.244.1.1:48586: EOF
2018/07/27 07:21:19 http: TLS handshake error from 10.244.1.1:48592: EOF
2018/07/27 07:21:29 http: TLS handshake error from 10.244.1.1:48598: EOF
level=info timestamp=2018-07-27T07:21:34.959535Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:21:39 http: TLS handshake error from 10.244.1.1:48604: EOF
2018/07/27 07:21:49 http: TLS handshake error from 10.244.1.1:48610: EOF
2018/07/27 07:21:59 http: TLS handshake error from 10.244.1.1:48616: EOF

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:16:50.591487Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=07f41104-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:16:50.591651Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=07f41104-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:16:50.673128Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwqxtm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwqxtm"
level=info timestamp=2018-07-27T07:17:26.425662Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=1d51d309-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:17:26.426009Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=1d51d309-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:17:26.500005Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwqxtm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwqxtm"
level=info timestamp=2018-07-27T07:17:26.515110Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwqxtm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwqxtm"
level=info timestamp=2018-07-27T07:18:02.146789Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:18:02.146940Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.038517Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisgj8t kind= uid=564f5ff5-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:19:02.038631Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisgj8t kind= uid=564f5ff5-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.114167Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisgj8t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisgj8t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 564f5ff5-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisgj8t"
level=info timestamp=2018-07-27T07:19:02.286111Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:19:02.286223Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.344499Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz4lsx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz4lsx"

Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T07:18:51.035563Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.036126Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.036234Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:18:51.036289Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:18:51.036312Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:18:51.038842Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.054584Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.054691Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.054770Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643245Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643426Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643541Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643616Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:19:01.767736Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:19:01.767918Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"

Pod name: virt-launcher-testvmiz4lsx-xswmt
Pod phase: Running
level=info timestamp=2018-07-27T07:19:05.533408Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:19:05.533706Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:19:05.535065Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:19:15.543006Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:19:15.564845Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiz4lsx"
level=info timestamp=2018-07-27T07:19:15.567275Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:19:15.567451Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.470 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    should start it [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:76

    Timed out after 90.005s.
    Timed out waiting for VMI to enter Running phase
    Expected
        : false
    to equal
        : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
level=info timestamp=2018-07-27T07:19:02.318242Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmiz4lsx-xswmt"

Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=info timestamp=2018-07-27T07:23:58.532738Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:23:58 http: TLS handshake error from 10.244.0.1:37806: EOF
level=info timestamp=2018-07-27T07:24:00.036712Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:24:08 http: TLS handshake error from 10.244.0.1:37830: EOF
level=info timestamp=2018-07-27T07:24:10.175112Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:24:10.176216Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:24:18 http: TLS handshake error from 10.244.0.1:37854: EOF
level=info timestamp=2018-07-27T07:24:28.578538Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:24:28 http: TLS handshake error from 10.244.0.1:37878: EOF
level=info timestamp=2018-07-27T07:24:30.065662Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:24:38 http: TLS handshake error from 10.244.0.1:37902: EOF
2018/07/27 07:24:48 http: TLS handshake error from 10.244.0.1:37926: EOF
level=info timestamp=2018-07-27T07:24:58.602609Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:24:58 http: TLS handshake error from 10.244.0.1:37950: EOF
level=info timestamp=2018-07-27T07:25:00.095816Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:23:09 http: TLS handshake error from 10.244.1.1:48658: EOF
2018/07/27 07:23:19 http: TLS handshake error from 10.244.1.1:48664: EOF
2018/07/27 07:23:29 http: TLS handshake error from 10.244.1.1:48670: EOF
level=info timestamp=2018-07-27T07:23:34.956014Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:23:39 http: TLS handshake error from 10.244.1.1:48676: EOF
2018/07/27 07:23:49 http: TLS handshake error from 10.244.1.1:48682: EOF
2018/07/27 07:23:59 http: TLS handshake error from 10.244.1.1:48688: EOF
level=info timestamp=2018-07-27T07:24:04.964769Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:24:09 http: TLS handshake error from 10.244.1.1:48694: EOF
2018/07/27 07:24:19 http: TLS handshake error from 10.244.1.1:48700: EOF
2018/07/27 07:24:29 http: TLS handshake error from 10.244.1.1:48706: EOF
level=info timestamp=2018-07-27T07:24:34.978724Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:24:39 http: TLS handshake error from 10.244.1.1:48712: EOF
2018/07/27 07:24:49 http: TLS handshake error from 10.244.1.1:48718: EOF
2018/07/27 07:24:59 http: TLS handshake error from 10.244.1.1:48724: EOF

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:17:26.425662Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=1d51d309-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:17:26.426009Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=1d51d309-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:17:26.500005Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwqxtm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwqxtm"
level=info timestamp=2018-07-27T07:17:26.515110Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwqxtm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwqxtm"
level=info timestamp=2018-07-27T07:18:02.146789Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:18:02.146940Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.038517Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisgj8t kind= uid=564f5ff5-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:19:02.038631Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisgj8t kind= uid=564f5ff5-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.114167Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisgj8t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisgj8t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 564f5ff5-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisgj8t"
level=info timestamp=2018-07-27T07:19:02.286111Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:19:02.286223Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.344499Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz4lsx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance
kubevirt-test-default/testvmiz4lsx" level=info timestamp=2018-07-27T07:22:02.579541Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz4lsx\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiz4lsx, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 5675538a-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz4lsx" level=info timestamp=2018-07-27T07:22:02.753940Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:22:02.754071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T07:18:51.035563Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:18:51.036126Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.036234Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:18:51.036289Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:18:51.036312Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:18:51.038842Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.054584Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.054691Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.054770Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643245Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643426Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643541Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643616Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:19:01.767736Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:19:01.767918Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-launcher-testvmidrzjm-86726
Pod phase: Running
level=info timestamp=2018-07-27T07:22:05.833830Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:22:05.834466Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:22:05.836976Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:22:15.843930Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:22:15.885697Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidrzjm"
level=info timestamp=2018-07-27T07:22:15.887629Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:22:15.887910Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.467 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
should attach virt-launcher to it [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:82
Timed out after 90.004s.
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
level=info timestamp=2018-07-27T07:22:02.851004Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmidrzjm-86726"
••••
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 07:26:58 http: TLS handshake error from 10.244.0.1:38238: EOF
level=info timestamp=2018-07-27T07:26:58.709191Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:27:00.203397Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:27:08 http: TLS handshake error from 10.244.0.1:38262: EOF
level=info timestamp=2018-07-27T07:27:10.179944Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:27:10.180736Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:27:18 http: TLS handshake error from 10.244.0.1:38286: EOF
2018/07/27 07:27:28 http: TLS handshake error from 10.244.0.1:38310: EOF
level=info timestamp=2018-07-27T07:27:28.741133Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:27:30.229634Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:27:38 http: TLS handshake error from 10.244.0.1:38334: EOF
2018/07/27 07:27:48 http: TLS handshake error from 10.244.0.1:38358: EOF
2018/07/27 07:27:58 http: TLS handshake error from 10.244.0.1:38382: EOF
level=info timestamp=2018-07-27T07:27:58.765837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:28:00.260008Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:26:09 http: TLS handshake error from 10.244.1.1:48766: EOF
2018/07/27 07:26:19 http: TLS handshake error from 10.244.1.1:48772: EOF
2018/07/27 07:26:29 http: TLS handshake error from 10.244.1.1:48778: EOF
level=info timestamp=2018-07-27T07:26:34.994615Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:26:39 http: TLS handshake error from 10.244.1.1:48784: EOF
2018/07/27 07:26:49 http: TLS handshake error from 10.244.1.1:48790: EOF
2018/07/27 07:26:59 http: TLS handshake error from 10.244.1.1:48796: EOF
level=info timestamp=2018-07-27T07:27:04.980022Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:27:09 http: TLS handshake error from 10.244.1.1:48802: EOF
2018/07/27 07:27:19 http: TLS handshake error from 10.244.1.1:48808: EOF
2018/07/27 07:27:29 http: TLS handshake error from 10.244.1.1:48814: EOF
level=info timestamp=2018-07-27T07:27:34.962524Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:27:39 http: TLS handshake error from 10.244.1.1:48820: EOF
2018/07/27 07:27:49 http: TLS handshake error from 10.244.1.1:48826: EOF
2018/07/27 07:27:59 http: TLS handshake error from 10.244.1.1:48832: EOF
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:19:02.286111Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:19:02.286223Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz4lsx kind= uid=5675538a-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:19:02.344499Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz4lsx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz4lsx"
level=info timestamp=2018-07-27T07:22:02.579541Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz4lsx\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiz4lsx, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 5675538a-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz4lsx"
level=info timestamp=2018-07-27T07:22:02.753940Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:22:02.754071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.037192Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidrzjm\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmidrzjm, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: c2063d91-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidrzjm"
level=info timestamp=2018-07-27T07:25:03.365168Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6nsbn kind= uid=2dad6b7e-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.365337Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6nsbn kind= uid=2dad6b7e-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.416335Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6nsbn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6nsbn"
level=info timestamp=2018-07-27T07:25:03.622636Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7x4v8 kind= uid=2dd4ebdb-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.623631Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7x4v8 kind= uid=2dd4ebdb-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.658653Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7x4v8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7x4v8"
level=info timestamp=2018-07-27T07:25:03.905574Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.906205Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T07:18:51.035563Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.036126Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.036234Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:18:51.036289Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:18:51.036312Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:18:51.038842Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.054584Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.054691Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.054770Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643245Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643426Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643541Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643616Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:19:01.767736Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:19:01.767918Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-n5fk6
Pod phase: Running
level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-launcher-testvmi6xtmv-jh2j7
Pod phase: Running
level=info timestamp=2018-07-27T07:25:07.406090Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:25:07.406373Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:25:07.409803Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:25:17.417422Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:25:17.464413Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6xtmv"
level=info timestamp=2018-07-27T07:25:17.466680Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:25:17.466853Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.474 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
with boot order
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:170
should be able to boot from selected disk
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
Alpine as first boot [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
Timed out after 90.010s.
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: defining a VirtualMachineInstance with an Alpine disk
STEP: adding a Cirros Disk
STEP: setting boot order
STEP: starting VirtualMachineInstance
STEP: Waiting the VirtualMachineInstance start
level=info timestamp=2018-07-27T07:25:04.041734Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmi6xtmv-jh2j7"
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 07:29:58 http: TLS handshake error from 10.244.0.1:38670: EOF
level=info timestamp=2018-07-27T07:29:58.879210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:30:00.375840Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:30:08 http: TLS handshake error from 10.244.0.1:38694: EOF
level=info timestamp=2018-07-27T07:30:10.266590Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:30:10.267769Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:30:18 http: TLS handshake error from 10.244.0.1:38718: EOF
2018/07/27 07:30:28 http: TLS handshake error from 10.244.0.1:38742: EOF
level=info timestamp=2018-07-27T07:30:28.907959Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:30:30.405946Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:30:38 http: TLS handshake error from 10.244.0.1:38766: EOF
2018/07/27 07:30:48 http: TLS handshake error from 10.244.0.1:38790: EOF
2018/07/27 07:30:58 http: TLS handshake error from 10.244.0.1:38814: EOF
level=info timestamp=2018-07-27T07:30:58.934107Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:31:00.433204Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:29:09 http: TLS handshake error from 10.244.1.1:48874: EOF
2018/07/27 07:29:19 http: TLS handshake error from 10.244.1.1:48880: EOF
2018/07/27 07:29:29 http: TLS handshake error from 10.244.1.1:48886: EOF
level=info timestamp=2018-07-27T07:29:34.986824Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:29:39 http: TLS handshake error from 10.244.1.1:48892: EOF
2018/07/27 07:29:49 http: TLS handshake error from 10.244.1.1:48898: EOF
2018/07/27 07:29:59 http: TLS handshake error from 10.244.1.1:48904: EOF
level=info timestamp=2018-07-27T07:30:04.963953Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:30:09 http: TLS handshake error from 10.244.1.1:48910: EOF
2018/07/27 07:30:19 http: TLS handshake error from 10.244.1.1:48916: EOF
2018/07/27 07:30:29 http: TLS handshake error from 10.244.1.1:48922: EOF
level=info timestamp=2018-07-27T07:30:34.989039Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:30:39 http: TLS handshake error from 10.244.1.1:48928: EOF
2018/07/27 07:30:49 http: TLS handshake error from 10.244.1.1:48934: EOF
2018/07/27 07:30:59 http: TLS handshake error from 10.244.1.1:48940: EOF
Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:22:02.753940Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:22:02.754071Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidrzjm kind= uid=c2063d91-916d-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.037192Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidrzjm\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmidrzjm, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: c2063d91-916d-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidrzjm"
level=info timestamp=2018-07-27T07:25:03.365168Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6nsbn kind= uid=2dad6b7e-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.365337Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6nsbn kind= uid=2dad6b7e-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.416335Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6nsbn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6nsbn"
level=info timestamp=2018-07-27T07:25:03.622636Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7x4v8 kind= uid=2dd4ebdb-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.623631Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7x4v8 kind= uid=2dd4ebdb-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:25:03.658653Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7x4v8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7x4v8"
level=info timestamp=2018-07-27T07:25:03.905574Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:25:03.906205Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:28:04.199342Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6xtmv\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi6xtmv, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2dff8f24-916e-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6xtmv"
level=info timestamp=2018-07-27T07:28:04.373902Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:28:04.374049Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:28:04.450132Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitlfxx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitlfxx"
Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-2qczc
Pod phase: Running
level=info timestamp=2018-07-27T07:18:51.035563Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.036126Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.036234Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:18:51.036289Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:18:51.036312Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:18:51.038842Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:18:51.054584Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.054691Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.054770Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:18:51.643245Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:18:51.643426Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:18:51.643541Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:18:51.643616Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind= uid=329c585d-916d-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:19:01.767736Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:19:01.767918Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiwqxtm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-n5fk6 Pod phase: Running level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-launcher-testvmitlfxx-54ggm Pod phase: Running level=info timestamp=2018-07-27T07:28:08.524884Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T07:28:08.525174Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T07:28:08.527312Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T07:28:18.532927Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T07:28:18.568505Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitlfxx" level=info timestamp=2018-07-27T07:28:18.570260Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T07:28:18.570408Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.459 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:170 should be able to boot from selected disk 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Cirros as first boot [It] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049 ------------------------------ STEP: defining a VirtualMachineInstance with an Alpine disk STEP: adding a Cirros Disk STEP: setting boot order STEP: starting VirtualMachineInstance STEP: Waiting the VirtualMachineInstance start level=info timestamp=2018-07-27T07:28:04.411298Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmitlfxx-54ggm" • [SLOW TEST:15.773 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:201 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:202 should retry starting the VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:203 ------------------------------ • [SLOW TEST:15.363 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:201 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:202 should log warning and proceed once the secret is there /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:233 ------------------------------ Pod name: disks-images-provider-tbdb6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zk9q6 Pod phase: Running copy all images to host mount directory Pod 
name: virt-api-7d79764579-zn76g Pod phase: Running 2018/07/27 07:33:28 http: TLS handshake error from 10.244.0.1:39174: EOF level=info timestamp=2018-07-27T07:33:29.096326Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:33:30.580348Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:33:38 http: TLS handshake error from 10.244.0.1:39198: EOF 2018/07/27 07:33:48 http: TLS handshake error from 10.244.0.1:39222: EOF 2018/07/27 07:33:58 http: TLS handshake error from 10.244.0.1:39246: EOF level=info timestamp=2018-07-27T07:33:59.126049Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:34:00.604438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:34:08 http: TLS handshake error from 10.244.0.1:39270: EOF level=info timestamp=2018-07-27T07:34:10.217469Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T07:34:10.218367Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:34:18 http: TLS handshake error from 10.244.0.1:39294: EOF 2018/07/27 07:34:28 http: TLS handshake error from 10.244.0.1:39318: EOF level=info timestamp=2018-07-27T07:34:29.156739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:34:30.632776Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 07:32:49 http: TLS handshake error from 10.244.1.1:49006: EOF 2018/07/27 07:32:59 http: TLS handshake error from 10.244.1.1:49012: EOF level=info timestamp=2018-07-27T07:33:04.999974Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:33:09 http: TLS handshake error from 10.244.1.1:49018: EOF 2018/07/27 07:33:19 http: TLS handshake error from 10.244.1.1:49024: EOF 2018/07/27 07:33:29 http: TLS handshake error from 10.244.1.1:49030: EOF level=info timestamp=2018-07-27T07:33:34.991521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:33:39 http: TLS handshake error from 10.244.1.1:49036: EOF 2018/07/27 07:33:49 http: TLS handshake error from 10.244.1.1:49042: EOF 2018/07/27 07:33:59 http: TLS handshake error from 10.244.1.1:49048: EOF level=info timestamp=2018-07-27T07:34:05.003797Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:34:09 http: TLS handshake error from 10.244.1.1:49054: EOF 2018/07/27 07:34:19 http: TLS handshake error from 10.244.1.1:49060: EOF 2018/07/27 07:34:29 http: TLS handshake error from 10.244.1.1:49066: EOF level=info timestamp=2018-07-27T07:34:34.984027Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info 
timestamp=2018-07-27T07:25:03.658653Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7x4v8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7x4v8" level=info timestamp=2018-07-27T07:25:03.905574Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:25:03.906205Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6xtmv kind= uid=2dff8f24-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:28:04.199342Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6xtmv\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi6xtmv, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2dff8f24-916e-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6xtmv" level=info timestamp=2018-07-27T07:28:04.373902Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:28:04.374049Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:28:04.450132Z pos=vmi.go:157 component=virt-controller 
service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitlfxx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitlfxx" level=info timestamp=2018-07-27T07:31:04.840003Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmis44r9 kind= uid=0521188a-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:04.840629Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmis44r9 kind= uid=0521188a-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:20.604163Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:20.604285Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:20.667979Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv7fkx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv7fkx" level=info timestamp=2018-07-27T07:31:35.971265Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4x8p6 kind= uid=17b0647b-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:35.971434Z pos=preset.go:165 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmi4x8p6 kind= uid=17b0647b-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:36.032200Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4x8p6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4x8p6" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T07:31:35.700368Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-27T07:31:35.700449Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-27T07:31:35.702385Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiv7fkx" level=info timestamp=2018-07-27T07:31:35.912235Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.912349Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." 
level=info timestamp=2018-07-27T07:31:35.912378Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing shutdown." level=info timestamp=2018-07-27T07:31:35.912572Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.914360Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-07-27T07:31:35.914398Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing shutdown." level=info timestamp=2018-07-27T07:31:35.914511Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.915256Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-27T07:31:35.915405Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-27T07:31:35.915470Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:31:35.915555Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:31:35.917405Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" Pod name: virt-handler-n5fk6 Pod phase: Running level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-launcher-testvmi4x8p6-5jjlk Pod phase: Running level=info timestamp=2018-07-27T07:31:39.149166Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-07-27T07:31:39.149455Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-07-27T07:31:39.151000Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-07-27T07:31:49.164275Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-07-27T07:31:49.187266Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4x8p6" level=info timestamp=2018-07-27T07:31:49.189142Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-07-27T07:31:49.189320Z 
pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.474 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70 when virt-launcher crashes /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:281 should be stopped and have Failed phase [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:282 Timed out after 90.005s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049 ------------------------------ level=info timestamp=2018-07-27T07:31:36.104726Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmi4x8p6-5jjlk" Pod name: disks-images-provider-tbdb6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zk9q6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-zn76g Pod phase: Running 2018/07/27 07:36:28 http: TLS handshake error from 10.244.0.1:39606: EOF level=info timestamp=2018-07-27T07:36:29.272965Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:36:30.749055Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:36:38 http: TLS handshake error from 10.244.0.1:39630: EOF 2018/07/27 07:36:48 http: TLS handshake error from 10.244.0.1:39654: EOF 2018/07/27 07:36:58 http: TLS handshake error from 10.244.0.1:39678: EOF level=info timestamp=2018-07-27T07:36:59.301993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 
statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:37:00.775892Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:37:08 http: TLS handshake error from 10.244.0.1:39702: EOF level=info timestamp=2018-07-27T07:37:10.222189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T07:37:10.222976Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:37:18 http: TLS handshake error from 10.244.0.1:39726: EOF 2018/07/27 07:37:28 http: TLS handshake error from 10.244.0.1:39750: EOF level=info timestamp=2018-07-27T07:37:29.330220Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:37:30.807063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 07:35:49 http: TLS handshake error from 10.244.1.1:49114: EOF 2018/07/27 07:35:59 http: TLS handshake error from 10.244.1.1:49120: EOF level=info timestamp=2018-07-27T07:36:04.997689Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:36:09 http: TLS handshake error from 10.244.1.1:49126: EOF 2018/07/27 07:36:19 http: TLS handshake error from 10.244.1.1:49132: EOF 2018/07/27 07:36:29 http: TLS handshake error from 10.244.1.1:49138: EOF level=info timestamp=2018-07-27T07:36:34.978725Z pos=filter.go:46 
component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:36:39 http: TLS handshake error from 10.244.1.1:49144: EOF 2018/07/27 07:36:49 http: TLS handshake error from 10.244.1.1:49150: EOF 2018/07/27 07:36:59 http: TLS handshake error from 10.244.1.1:49156: EOF level=info timestamp=2018-07-27T07:37:04.966861Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:37:09 http: TLS handshake error from 10.244.1.1:49162: EOF 2018/07/27 07:37:19 http: TLS handshake error from 10.244.1.1:49168: EOF 2018/07/27 07:37:29 http: TLS handshake error from 10.244.1.1:49174: EOF level=info timestamp=2018-07-27T07:37:34.978186Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T07:28:04.199342Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6xtmv\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi6xtmv, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2dff8f24-916e-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6xtmv" level=info timestamp=2018-07-27T07:28:04.373902Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:28:04.374049Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitlfxx kind= uid=999136ed-916e-11e8-86a4-525500d15501 msg="Marking 
VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:28:04.450132Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitlfxx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitlfxx" level=info timestamp=2018-07-27T07:31:04.840003Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmis44r9 kind= uid=0521188a-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:04.840629Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmis44r9 kind= uid=0521188a-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:20.604163Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:20.604285Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:20.667979Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiv7fkx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiv7fkx" level=info timestamp=2018-07-27T07:31:35.971265Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4x8p6 kind= uid=17b0647b-916f-11e8-86a4-525500d15501 msg="Initializing 
VirtualMachineInstance" level=info timestamp=2018-07-27T07:31:35.971434Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4x8p6 kind= uid=17b0647b-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:31:36.032200Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4x8p6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4x8p6" level=info timestamp=2018-07-27T07:34:36.449093Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwbgxn kind= uid=8342db3b-916f-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:34:36.449222Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwbgxn kind= uid=8342db3b-916f-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:34:36.563249Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwbgxn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwbgxn" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2qczc Pod phase: Running level=info timestamp=2018-07-27T07:31:35.700368Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Shutting down domain for 
VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-27T07:31:35.700449Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-27T07:31:35.702385Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmiv7fkx" level=info timestamp=2018-07-27T07:31:35.912235Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind= uid=0e87e6d0-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.912349Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-07-27T07:31:35.912378Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing shutdown." level=info timestamp=2018-07-27T07:31:35.912572Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.914360Z pos=vm.go:330 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-07-27T07:31:35.914398Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing shutdown." 
level=info timestamp=2018-07-27T07:31:35.914511Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.915256Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-27T07:31:35.915405Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-27T07:31:35.915470Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:31:35.915555Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiv7fkx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:31:35.917405Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" Pod name: virt-handler-n5fk6 Pod phase: Running level=info timestamp=2018-07-27T06:26:59.360465Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-07-27T06:26:59.368448Z pos=vm.go:210 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-07-27T06:26:59.370331Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-07-27T06:26:59.469715Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T06:26:59.617851Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T06:26:59.630449Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"

Pod name: virt-launcher-testvmiwbgxn-fxvzv
Pod phase: Running
level=info timestamp=2018-07-27T07:34:39.672464Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:34:39.673266Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:34:39.675483Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:34:49.680662Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:34:49.721116Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwbgxn"
level=info timestamp=2018-07-27T07:34:49.722715Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:34:49.722849Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.476 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    when virt-handler crashes
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:304
      should recover and continue management [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:305

      Timed out after 90.004s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
level=info timestamp=2018-07-27T07:34:36.549713Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmiwbgxn-fxvzv"
• [SLOW TEST:10.279 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    when virt-handler is responsive
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:335
      should indicate that a node is ready for vmis
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:336
------------------------------
• [SLOW TEST:91.477 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    when virt-handler is not responsive
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:366
      the node controller should react
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:405
------------------------------
• [SLOW TEST:17.646 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    with node tainted
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:458
      the vmi with tolerations should be scheduled
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:480
------------------------------
•
------------------------------
S [SKIPPING] [0.226 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    with non default namespace
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:530
      should log libvirt start and stop lifecycle events of the domain
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        kubevirt-test-default [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Skip log query tests for JENKINS ci test environment
        /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:535
------------------------------
S [SKIPPING] [0.058 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    with non default namespace
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:530
      should log libvirt start and stop lifecycle events of the domain
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        kubevirt-test-alternative [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Skip log query tests for JENKINS ci test environment
        /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:535
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.070 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    VirtualMachineInstance Emulation Mode
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:591
      should enable emulation in virt-launcher [BeforeEach]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:603

      Software emulation is not enabled on this cluster
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:599
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.069 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    VirtualMachineInstance Emulation Mode
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:591
      should be reflected in domain XML [BeforeEach]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:640

      Software emulation is not enabled on this cluster
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:599
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.052 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:70
    VirtualMachineInstance Emulation Mode
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:591
      should request a TUN device but not KVM [BeforeEach]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:684

      Software emulation is not enabled on this cluster
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:599
------------------------------
••••
Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=info timestamp=2018-07-27T07:41:29.575439Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:41:31.059517Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:41:38 http: TLS handshake error from 10.244.0.1:40354: EOF
2018/07/27 07:41:48 http: TLS handshake error from 10.244.0.1:40378: EOF
2018/07/27 07:41:58 http: TLS handshake error from 10.244.0.1:40402: EOF
level=info
timestamp=2018-07-27T07:41:59.599851Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:42:01.094528Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:42:08 http: TLS handshake error from 10.244.0.1:40426: EOF level=info timestamp=2018-07-27T07:42:10.726830Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-27T07:42:10.727575Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:42:18 http: TLS handshake error from 10.244.0.1:40450: EOF 2018/07/27 07:42:28 http: TLS handshake error from 10.244.0.1:40474: EOF level=info timestamp=2018-07-27T07:42:29.628597Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:42:31.121418Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:42:38 http: TLS handshake error from 10.244.0.1:40498: EOF Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 07:40:49 http: TLS handshake error from 10.244.1.1:49296: EOF 2018/07/27 07:40:59 http: TLS handshake error from 10.244.1.1:49302: EOF level=info timestamp=2018-07-27T07:41:04.993134Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:41:09 http: TLS handshake 
error from 10.244.1.1:49308: EOF 2018/07/27 07:41:19 http: TLS handshake error from 10.244.1.1:49314: EOF 2018/07/27 07:41:29 http: TLS handshake error from 10.244.1.1:49320: EOF level=info timestamp=2018-07-27T07:41:34.999046Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:41:39 http: TLS handshake error from 10.244.1.1:49326: EOF 2018/07/27 07:41:49 http: TLS handshake error from 10.244.1.1:49332: EOF 2018/07/27 07:41:59 http: TLS handshake error from 10.244.1.1:49338: EOF level=info timestamp=2018-07-27T07:42:04.969868Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:42:09 http: TLS handshake error from 10.244.1.1:49344: EOF 2018/07/27 07:42:19 http: TLS handshake error from 10.244.1.1:49350: EOF 2018/07/27 07:42:29 http: TLS handshake error from 10.244.1.1:49356: EOF level=info timestamp=2018-07-27T07:42:34.998810Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T07:39:18.737162Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirpl2r\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirpl2r" level=info timestamp=2018-07-27T07:39:36.353134Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihdfns kind= uid=36046e93-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:39:36.353945Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmihdfns kind= uid=36046e93-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:39:36.409325Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns" level=info timestamp=2018-07-27T07:39:36.418375Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns" level=info timestamp=2018-07-27T07:39:37.496467Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmihdfns, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 36046e93-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns" level=info timestamp=2018-07-27T07:39:37.972028Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:39:37.972171Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:39:38.239108Z pos=preset.go:139 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:39:38.239211Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:39:38.329824Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminkvwj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminkvwj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 37247ada-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminkvwj" level=info timestamp=2018-07-27T07:39:38.674724Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:39:38.674841Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:39:38.737495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7" level=info timestamp=2018-07-27T07:39:38.765302Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io 
\"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-k7vk7 Pod phase: Running level=info timestamp=2018-07-27T07:39:36.024363Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T07:39:36.030922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.192557Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-27T07:39:36.192645Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-27T07:39:36.235523Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmirpl2r" level=info timestamp=2018-07-27T07:39:36.446983Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-27T07:39:36.447100Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:39:36.447261Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.447242Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-27T07:39:36.447320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.447362Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.447596Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.448951Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.449051Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:39:36.451369Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" Pod name: virt-handler-ttfvq Pod phase: Running level=info timestamp=2018-07-27T07:39:01.740230Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock" level=error timestamp=2018-07-27T07:39:01.740273Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740298Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock" level=error timestamp=2018-07-27T07:39:01.740339Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740364Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock" level=error timestamp=2018-07-27T07:39:01.740406Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.836231Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T07:39:01.845527Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-07-27T07:39:01.847594Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T07:39:01.936725Z pos=vm.go:392 component=virt-handler 
namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="No update processing required" level=info timestamp=2018-07-27T07:39:01.936966Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418328Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:18.418466Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418598Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:18.418719Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmivfqd7-7ptld
Pod phase: Running
level=info timestamp=2018-07-27T07:39:42.226186Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:39:42.226990Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:39:42.229073Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:39:52.234995Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:39:52.255697Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivfqd7"
level=info timestamp=2018-07-27T07:39:52.257445Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:39:52.257589Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.328 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Delete a VirtualMachineInstance's Pod
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:836
    should result in the VirtualMachineInstance moving to a finalized state [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:837

    Timed out after 90.004s.
    Timed out waiting for VMI to enter Running phase
    Expected
        : false
    to equal
        : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: Creating the VirtualMachineInstance
level=info timestamp=2018-07-27T07:39:38.776934Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmivfqd7-7ptld"

Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
level=info timestamp=2018-07-27T07:44:29.745374Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:44:31.239943Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:44:38 http: TLS handshake error from 10.244.0.1:40786: EOF
2018/07/27 07:44:48 http: TLS handshake error from 10.244.0.1:40810: EOF
2018/07/27 07:44:58 http: TLS handshake error from 10.244.0.1:40834: EOF
level=info timestamp=2018-07-27T07:44:59.772213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:45:01.267042Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:45:08 http: TLS handshake error from 10.244.0.1:40858: EOF
level=info timestamp=2018-07-27T07:45:10.266893Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-27T07:45:10.268171Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:45:18 http: TLS handshake error from 10.244.0.1:40882: EOF
2018/07/27 07:45:28 http: TLS handshake error from 10.244.0.1:40906: EOF
level=info timestamp=2018-07-27T07:45:29.814789Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:45:31.303835Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:45:38 http: TLS handshake error from 10.244.0.1:40930: EOF

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:43:49 http: TLS handshake error from 10.244.1.1:49404: EOF
2018/07/27 07:43:59 http: TLS handshake error from 10.244.1.1:49410: EOF
level=info timestamp=2018-07-27T07:44:04.992176Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:44:09 http: TLS handshake error from 10.244.1.1:49416: EOF
2018/07/27 07:44:19 http: TLS handshake error from 10.244.1.1:49422: EOF
2018/07/27 07:44:29 http: TLS handshake error from 10.244.1.1:49428: EOF
level=info timestamp=2018-07-27T07:44:34.976516Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:44:39 http: TLS handshake error from 10.244.1.1:49434: EOF
2018/07/27 07:44:49 http: TLS handshake error from 10.244.1.1:49440: EOF
2018/07/27 07:44:59 http: TLS handshake error from 10.244.1.1:49446: EOF
level=info timestamp=2018-07-27T07:45:05.004257Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:45:09 http: TLS handshake error from 10.244.1.1:49452: EOF
2018/07/27 07:45:19 http: TLS handshake error from 10.244.1.1:49458: EOF
2018/07/27 07:45:29 http: TLS handshake error from 10.244.1.1:49464: EOF
level=info timestamp=2018-07-27T07:45:34.990101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:39:36.353945Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihdfns kind= uid=36046e93-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:36.409325Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns"
level=info timestamp=2018-07-27T07:39:36.418375Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns"
level=info timestamp=2018-07-27T07:39:37.496467Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmihdfns, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 36046e93-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns"
level=info timestamp=2018-07-27T07:39:37.972028Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:37.972171Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.239108Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.239211Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.329824Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminkvwj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminkvwj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 37247ada-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminkvwj"
level=info timestamp=2018-07-27T07:39:38.674724Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.674841Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.737495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:39:38.765302Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:42:39.156065Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:42:39.157907Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-handler-k7vk7
Pod phase: Running
level=info timestamp=2018-07-27T07:39:36.024363Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T07:39:36.030922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.192557Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:39:36.192645Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:39:36.235523Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmirpl2r"
level=info timestamp=2018-07-27T07:39:36.446983Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:39:36.447100Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.447261Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.447242Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:39:36.447320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.447362Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.447596Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.448951Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.449051Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.451369Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"

Pod name: virt-handler-ttfvq
Pod phase: Running
level=info timestamp=2018-07-27T07:39:01.740230Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock"
level=error timestamp=2018-07-27T07:39:01.740273Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.740298Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock"
level=error timestamp=2018-07-27T07:39:01.740339Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.740364Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock"
level=error timestamp=2018-07-27T07:39:01.740406Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.836231Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T07:39:01.845527Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
level=info timestamp=2018-07-27T07:39:01.847594Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T07:39:01.936725Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T07:39:01.936966Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:18.418328Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:18.418466Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:18.418598Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:18.418719Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmizw8jg-zd9pm
Pod phase: Running
level=info timestamp=2018-07-27T07:42:42.728126Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:42:42.728978Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:42:42.731290Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:42:52.769124Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:42:52.789161Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmizw8jg"
level=info timestamp=2018-07-27T07:42:52.790940Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:42:52.791099Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.459 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Delete a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:868
    with an active pod.
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
      should result in pod being terminated [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:870

      Timed out after 90.004s.
      Timed out waiting for VMI to enter Running phase
      Expected
        : false
      to equal
        : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: Creating the VirtualMachineInstance
level=info timestamp=2018-07-27T07:42:39.177636Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmizw8jg-zd9pm"

Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 07:47:18 http: TLS handshake error from 10.244.0.1:41170: EOF
2018/07/27 07:47:28 http: TLS handshake error from 10.244.0.1:41194: EOF
level=info timestamp=2018-07-27T07:47:29.942071Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:47:31.421028Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:47:38 http: TLS handshake error from 10.244.0.1:41218: EOF
2018/07/27 07:47:48 http: TLS handshake error from 10.244.0.1:41242: EOF
2018/07/27 07:47:58 http: TLS handshake error from 10.244.0.1:41266: EOF
level=info timestamp=2018-07-27T07:47:59.969385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:48:01.450880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:48:08 http: TLS handshake error from 10.244.0.1:41290: EOF
2018/07/27 07:48:18 http: TLS handshake error from 10.244.0.1:41314: EOF
2018/07/27 07:48:28 http: TLS handshake error from 10.244.0.1:41338: EOF
level=info timestamp=2018-07-27T07:48:29.998265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:48:31.478849Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:48:38 http: TLS handshake error from 10.244.0.1:41362: EOF

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:46:59 http: TLS handshake error from 10.244.1.1:49518: EOF
level=info timestamp=2018-07-27T07:47:04.997330Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:47:09 http: TLS handshake error from 10.244.1.1:49524: EOF
2018/07/27 07:47:19 http: TLS handshake error from 10.244.1.1:49530: EOF
2018/07/27 07:47:29 http: TLS handshake error from 10.244.1.1:49536: EOF
level=info timestamp=2018-07-27T07:47:35.005754Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:47:39 http: TLS handshake error from 10.244.1.1:49542: EOF
2018/07/27 07:47:49 http: TLS handshake error from 10.244.1.1:49548: EOF
2018/07/27 07:47:59 http: TLS handshake error from 10.244.1.1:49554: EOF
level=info timestamp=2018-07-27T07:48:04.992533Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:48:09 http: TLS handshake error from 10.244.1.1:49560: EOF
2018/07/27 07:48:19 http: TLS handshake error from 10.244.1.1:49566: EOF
2018/07/27 07:48:29 http: TLS handshake error from 10.244.1.1:49572: EOF
level=info timestamp=2018-07-27T07:48:35.008532Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:48:39 http: TLS handshake error from 10.244.1.1:49578: EOF

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:39:37.496467Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihdfns\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmihdfns, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 36046e93-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihdfns"
level=info timestamp=2018-07-27T07:39:37.972028Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:37.972171Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pz6s kind= uid=36fc3548-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.239108Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.239211Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.329824Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminkvwj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminkvwj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 37247ada-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminkvwj"
level=info timestamp=2018-07-27T07:39:38.674724Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.674841Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.737495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:39:38.765302Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:42:39.156065Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:42:39.157907Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:45:39.435501Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizw8jg\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizw8jg, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a2fa4a5c-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizw8jg"
level=info timestamp=2018-07-27T07:45:39.627387Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:45:39.627526Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-controller-7d57d96b65-bnbgh
Pod phase: Running
level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-handler-k7vk7
Pod phase: Running
level=info timestamp=2018-07-27T07:39:36.024363Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-27T07:39:36.030922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.192557Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp."
level=info timestamp=2018-07-27T07:39:36.192645Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing shutdown."
level=info timestamp=2018-07-27T07:39:36.235523Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmirpl2r"
level=info timestamp=2018-07-27T07:39:36.446983Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-07-27T07:39:36.447100Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.447261Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.447242Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-27T07:39:36.447320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.447362Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.447596Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.448951Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:36.449051Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:36.451369Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"

Pod name: virt-handler-ttfvq
Pod phase: Running
level=info timestamp=2018-07-27T07:39:01.740230Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock"
level=error timestamp=2018-07-27T07:39:01.740273Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.740298Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock"
level=error timestamp=2018-07-27T07:39:01.740339Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.740364Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock"
level=error timestamp=2018-07-27T07:39:01.740406Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-27T07:39:01.836231Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-27T07:39:01.845527Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
level=info timestamp=2018-07-27T07:39:01.847594Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-27T07:39:01.936725Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-27T07:39:01.936966Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:18.418328Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:18.418466Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-27T07:39:18.418598Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-27T07:39:18.418719Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmi4wjgw-hqc5z
Pod phase: Running
level=info timestamp=2018-07-27T07:45:43.702433Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:45:43.703112Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:45:43.704572Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:45:53.711129Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:45:53.763355Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4wjgw"
level=info timestamp=2018-07-27T07:45:53.764938Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:45:53.765184Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.484 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Delete a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:868
    with grace period greater than 0
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:893
      should run graceful shutdown [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:894

      Timed out after 90.006s.
      Timed out waiting for VMI to enter Running phase
      Expected
        : false
      to equal
        : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: Setting a VirtualMachineInstance termination grace period to 5
STEP: Creating the VirtualMachineInstance
level=info timestamp=2018-07-27T07:45:39.686243Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmi4wjgw-hqc5z"

Pod name: disks-images-provider-tbdb6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zk9q6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-zn76g
Pod phase: Running
2018/07/27 07:50:18 http: TLS handshake error from 10.244.0.1:41602: EOF
2018/07/27 07:50:28 http: TLS handshake error from 10.244.0.1:41626: EOF
level=info timestamp=2018-07-27T07:50:30.229542Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:50:31.650442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:50:38 http: TLS handshake error from 10.244.0.1:41650: EOF
2018/07/27 07:50:48 http: TLS handshake error from 10.244.0.1:41674: EOF
2018/07/27 07:50:58 http: TLS handshake error from 10.244.0.1:41698: EOF
level=info timestamp=2018-07-27T07:51:00.252510Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:51:01.671589Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:51:08 http: TLS handshake error from 10.244.0.1:41722: EOF
2018/07/27 07:51:18 http: TLS handshake error from 10.244.0.1:41746: EOF
2018/07/27 07:51:28 http: TLS handshake error from 10.244.0.1:41770: EOF
level=info timestamp=2018-07-27T07:51:30.275579Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-27T07:51:31.709267Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/27 07:51:38 http: TLS handshake error from 10.244.0.1:41794: EOF

Pod name: virt-api-7d79764579-zplcq
Pod phase: Running
2018/07/27 07:49:59 http: TLS handshake error from 10.244.1.1:49626: EOF
level=info timestamp=2018-07-27T07:50:04.973981Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:50:09 http: TLS handshake error from 10.244.1.1:49632: EOF
2018/07/27 07:50:19 http: TLS handshake error from 10.244.1.1:49638: EOF
2018/07/27 07:50:29 http: TLS handshake error from 10.244.1.1:49644: EOF
level=info timestamp=2018-07-27T07:50:34.990552Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:50:39 http: TLS handshake error from 10.244.1.1:49650: EOF
2018/07/27 07:50:49 http: TLS handshake error from 10.244.1.1:49656: EOF
2018/07/27 07:50:59 http: TLS handshake error from 10.244.1.1:49662: EOF
level=info timestamp=2018-07-27T07:51:05.008927Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:51:09 http: TLS handshake error from 10.244.1.1:49668: EOF
2018/07/27 07:51:19 http: TLS handshake error from 10.244.1.1:49674: EOF
2018/07/27 07:51:29 http: TLS handshake error from 10.244.1.1:49680: EOF
level=info timestamp=2018-07-27T07:51:35.002180Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/27 07:51:39 http: TLS handshake error from 10.244.1.1:49686: EOF

Pod name: virt-controller-7d57d96b65-6dx5k
Pod phase: Running
level=info timestamp=2018-07-27T07:39:38.239108Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.239211Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvminkvwj kind= uid=37247ada-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.329824Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminkvwj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminkvwj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 37247ada-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminkvwj"
level=info timestamp=2018-07-27T07:39:38.674724Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:39:38.674841Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:39:38.737495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:39:38.765302Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7"
level=info timestamp=2018-07-27T07:42:39.156065Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:42:39.157907Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-27T07:45:39.435501Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizw8jg\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizw8jg, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a2fa4a5c-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizw8jg"
level=info timestamp=2018-07-27T07:45:39.627387Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-27T07:45:39.627526Z pos=preset.go:165
component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:48:39.976237Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4wjgw\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4wjgw, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 0e8bb636-9171-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4wjgw" level=info timestamp=2018-07-27T07:48:40.116829Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixl4xq kind= uid=7a1e0f37-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:48:40.116945Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixl4xq kind= uid=7a1e0f37-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-k7vk7 Pod phase: Running level=info timestamp=2018-07-27T07:39:36.024363Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T07:39:36.030922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:39:36.192557Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-27T07:39:36.192645Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-27T07:39:36.235523Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmirpl2r" level=info timestamp=2018-07-27T07:39:36.446983Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-27T07:39:36.447100Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.447261Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.447242Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-27T07:39:36.447320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.447362Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-27T07:39:36.447596Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.448951Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.449051Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.451369Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" Pod name: virt-handler-ttfvq Pod phase: Running level=info timestamp=2018-07-27T07:39:01.740230Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock" level=error timestamp=2018-07-27T07:39:01.740273Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740298Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock" level=error timestamp=2018-07-27T07:39:01.740339Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740364Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock" level=error timestamp=2018-07-27T07:39:01.740406Z pos=cache.go:124 component=virt-handler reason="dial unix 
/var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.836231Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T07:39:01.845527Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-07-27T07:39:01.847594Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T07:39:01.936725Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="No update processing required" level=info timestamp=2018-07-27T07:39:01.936966Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418328Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:18.418466Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418598Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:18.418719Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmixl4xq-pl5lx Pod phase: Running
level=info timestamp=2018-07-27T07:48:43.983108Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:48:43.983428Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:48:43.985380Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:48:53.991180Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:48:54.042255Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmixl4xq"
level=info timestamp=2018-07-27T07:48:54.043819Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:48:54.044010Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.475 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:945
    should be in Failed phase [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:946

    Timed out after 90.005s.
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049 ------------------------------ STEP: Starting a VirtualMachineInstance level=info timestamp=2018-07-27T07:48:40.151123Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmixl4xq-pl5lx" Pod name: disks-images-provider-tbdb6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zk9q6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-zn76g Pod phase: Running level=info timestamp=2018-07-27T07:53:30.454801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:53:31.823881Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:53:38 http: TLS handshake error from 10.244.0.1:42082: EOF 2018/07/27 07:53:48 http: TLS handshake error from 10.244.0.1:42106: EOF 2018/07/27 07:53:58 http: TLS handshake error from 10.244.0.1:42130: EOF level=info timestamp=2018-07-27T07:54:00.483962Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:54:01.853348Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:54:08 http: TLS handshake error from 10.244.0.1:42154: EOF level=info timestamp=2018-07-27T07:54:10.282911Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info 
timestamp=2018-07-27T07:54:10.286916Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:54:18 http: TLS handshake error from 10.244.0.1:42178: EOF 2018/07/27 07:54:28 http: TLS handshake error from 10.244.0.1:42202: EOF level=info timestamp=2018-07-27T07:54:30.512214Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-27T07:54:31.880868Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/27 07:54:38 http: TLS handshake error from 10.244.0.1:42226: EOF Pod name: virt-api-7d79764579-zplcq Pod phase: Running 2018/07/27 07:52:59 http: TLS handshake error from 10.244.1.1:49734: EOF level=info timestamp=2018-07-27T07:53:04.997013Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:53:09 http: TLS handshake error from 10.244.1.1:49740: EOF 2018/07/27 07:53:19 http: TLS handshake error from 10.244.1.1:49746: EOF 2018/07/27 07:53:29 http: TLS handshake error from 10.244.1.1:49752: EOF level=info timestamp=2018-07-27T07:53:34.993318Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:53:39 http: TLS handshake error from 10.244.1.1:49758: EOF 2018/07/27 07:53:49 http: TLS handshake error from 10.244.1.1:49764: EOF 2018/07/27 07:53:59 http: TLS handshake error from 10.244.1.1:49770: EOF level=info timestamp=2018-07-27T07:54:05.003949Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:54:09 http: 
TLS handshake error from 10.244.1.1:49776: EOF 2018/07/27 07:54:19 http: TLS handshake error from 10.244.1.1:49782: EOF 2018/07/27 07:54:29 http: TLS handshake error from 10.244.1.1:49788: EOF level=info timestamp=2018-07-27T07:54:34.992608Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/27 07:54:39 http: TLS handshake error from 10.244.1.1:49794: EOF Pod name: virt-controller-7d57d96b65-6dx5k Pod phase: Running level=info timestamp=2018-07-27T07:39:38.674724Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:39:38.674841Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivfqd7 kind= uid=37670322-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:39:38.737495Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7" level=info timestamp=2018-07-27T07:39:38.765302Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivfqd7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivfqd7" level=info timestamp=2018-07-27T07:42:39.156065Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" 
level=info timestamp=2018-07-27T07:42:39.157907Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizw8jg kind= uid=a2fa4a5c-9170-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:45:39.435501Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizw8jg\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizw8jg, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a2fa4a5c-9170-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizw8jg" level=info timestamp=2018-07-27T07:45:39.627387Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:45:39.627526Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wjgw kind= uid=0e8bb636-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:48:39.976237Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4wjgw\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4wjgw, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 0e8bb636-9171-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4wjgw" level=info timestamp=2018-07-27T07:48:40.116829Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixl4xq 
kind= uid=7a1e0f37-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:48:40.116945Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixl4xq kind= uid=7a1e0f37-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-27T07:51:40.400950Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixl4xq\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmixl4xq, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 7a1e0f37-9171-11e8-86a4-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixl4xq" level=info timestamp=2018-07-27T07:51:40.574093Z pos=preset.go:139 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi28fch kind= uid=e5b0085b-9171-11e8-86a4-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-27T07:51:40.574271Z pos=preset.go:165 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi28fch kind= uid=e5b0085b-9171-11e8-86a4-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-bnbgh Pod phase: Running level=info timestamp=2018-07-27T06:28:37.027651Z pos=application.go:174 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-k7vk7 Pod phase: Running level=info timestamp=2018-07-27T07:39:36.024363Z pos=vm.go:389 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-27T07:39:36.030922Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r 
kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.192557Z pos=vm.go:342 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-27T07:39:36.192645Z pos=vm.go:383 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-27T07:39:36.235523Z pos=vm.go:540 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="grace period expired, killing deleted VirtualMachineInstance testvmirpl2r" level=info timestamp=2018-07-27T07:39:36.446983Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-07-27T07:39:36.447100Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind= uid=2b6e62a5-9170-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.447261Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.447242Z pos=vm.go:678 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-27T07:39:36.447320Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-27T07:39:36.447362Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.447596Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.448951Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:36.449051Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirpl2r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:36.451369Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" Pod name: virt-handler-ttfvq Pod phase: Running level=info timestamp=2018-07-27T07:39:01.740230Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock" level=error timestamp=2018-07-27T07:39:01.740273Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiv4ntsphlqc_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740298Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock" level=error timestamp=2018-07-27T07:39:01.740339Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiwbgxn_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.740364Z 
pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock" level=error timestamp=2018-07-27T07:39:01.740406Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmiz4lsx_sock: connect: connection refused" msg="failed to connect to cmd client socket" level=info timestamp=2018-07-27T07:39:01.836231Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-07-27T07:39:01.845527Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-07-27T07:39:01.847594Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-07-27T07:39:01.936725Z pos=vm.go:392 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="No update processing required" level=info timestamp=2018-07-27T07:39:01.936966Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid=f4db897a-916f-11e8-86a4-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418328Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-27T07:39:18.418466Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-27T07:39:18.418598Z pos=vm.go:386 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-27T07:39:18.418719Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmidv2jv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi28fch-mm7fv Pod phase: Running
level=info timestamp=2018-07-27T07:51:44.305244Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-07-27T07:51:44.305550Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-07-27T07:51:44.307548Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-07-27T07:51:54.316591Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-07-27T07:51:54.355143Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi28fch"
level=info timestamp=2018-07-27T07:51:54.357419Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-07-27T07:51:54.357750Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.468 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:48
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:945
    should be left alone by virt-handler [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:973

    Timed out after 90.004s.
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1049
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-27T07:51:40.638639Z pos=utils.go:243 component=tests msg="Created virtual machine pod virt-launcher-testvmi28fch-mm7fv"
S [SKIPPING] in Spec Setup (BeforeEach) [0.010 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      should succeed to generate a VM JSON file using oc-process command
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:150

      Skip test that requires oc binary
      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1393
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.001 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        should succeed to create a VM using oc-create command
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:156

        Skip test that requires oc binary
        /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1393
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.001 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        with given VM from the VM JSON
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
          should succeed to launch a VMI using oc-patch command
          /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:161

          Skip test that requires oc binary
          /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1393
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.001 seconds]
Templates
/root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42
  Launching VMI from VM Template [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60
    with given Fedora Template
    /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193
      with given VM JSON from the Template
      /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152
        with given VM from the VM JSON
        /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158
          with given VMI from the VM
          /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:163
            should succeed to terminate the VMI using oc-patch command
            /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:166

            Skip test that requires oc binary
            /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1393
------------------------------
••
------------------------------
• [SLOW TEST:17.128 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should update VirtualMachine once VMIs are up
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195
------------------------------
••
------------------------------
• [SLOW TEST:47.473 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if it gets deleted
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245
------------------------------
• [SLOW TEST:76.705 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265
------------------------------
• [SLOW TEST:43.469 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should stop VirtualMachineInstance if running set to false
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325
------------------------------

panic: test timed out after 1h30m0s

goroutine 12480 [running]:
testing.(*M).startAlarm.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc
created by time.goFunc
	/gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44

goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(0xc420856d20, 0x139cbc3, 0x9, 0x142ef00, 0x4801e6)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301
testing.runTests.func1(0xc420856c30)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64
testing.tRunner(0xc420856c30, 0xc420891df8)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
testing.runTests(0xc42050d900, 0x1d2fa50, 0x1, 0x1, 0x412009)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4
testing.(*M).Run(0xc420611200, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171
main.main()
	_testmain.go:44 +0x151

goroutine 5 [chan receive]:
kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1d5b280)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b
created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203

goroutine 6 [syscall, 90 minutes]:
os/signal.signal_recv(0x0)
/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6 os/signal.loop() /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22 created by os/signal.init.0 /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41 goroutine 11 [select]: kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).match(0xc4208c2840, 0x14c2900, 0x1d79938, 0x412801, 0xc4206208f0, 0x1, 0x1, 0xc4206208f0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:139 +0x2e6 kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).Should(0xc4208c2840, 0x14c2900, 0x1d79938, 0xc4206208f0, 0x1, 0x1, 0xc4208c2840) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:48 +0x62 kubevirt.io/kubevirt/tests_test.glob..func13.3.3(0xc420e00780, 0xc420e00c80) /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:183 +0x480 kubevirt.io/kubevirt/tests_test.glob..func13.3.10() /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:342 +0x118 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc4207bd440, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc4207bd440, 0x3, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*ItNode).Run(0xc42042aba0, 0x14b4b80, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/it_node.go:26 +0x7f kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc4204b4c30, 0x0, 0x14b4b80, 0xc420055540) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:203 +0x648 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc4204b4c30, 0x14b4b80, 0xc420055540) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc420109b80, 0xc4204b4c30, 0x0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc420109b80, 0x1) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc420109b80, 0xb) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200ccaf0, 0x7f4723200b50, 0xc420856d20, 0x139f1a6, 0xb, 0xc42050d940, 0x2, 0x2, 0x14d1400, 0xc420055540, ...) 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x14b5be0, 0xc420856d20, 0x139f1a6, 0xb, 0xc42050d920, 0x2, 0x2, 0x2) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x14b5be0, 0xc420856d20, 0x139f1a6, 0xb, 0xc4202517b0, 0x1, 0x1, 0x1) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab kubevirt.io/kubevirt/tests_test.TestTests(0xc420856d20) /root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa testing.tRunner(0xc420856d20, 0x142ef00) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0 created by testing.(*T).Run /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0 goroutine 12 [chan receive, 90 minutes]: kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc420109b80, 0xc4200bfc80) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1 created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88 goroutine 13 [select, 90 minutes, locked to thread]: runtime.gopark(0x14310d8, 0x0, 0x13996e5, 0x6, 0x18, 0x1) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a runtime.selectgo(0xc42047ff50, 0xc4200bfd40) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50 runtime.ensureSigM.func1() /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4 runtime.goexit() /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1 goroutine 22 [IO wait]: internal/poll.runtime_pollWait(0x7f47231e9f00, 
0x72, 0xc420089850) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57 internal/poll.(*pollDesc).wait(0xc420959698, 0x72, 0xffffffffffffff00, 0x14b6da0, 0x1c467d0) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b internal/poll.(*pollDesc).waitRead(0xc420959698, 0xc4206ec000, 0x8000, 0x8000) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d internal/poll.(*FD).Read(0xc420959680, 0xc4206ec000, 0x8000, 0x8000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d net.(*netFD).Read(0xc420959680, 0xc4206ec000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb) /gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f net.(*conn).Read(0xc420122630, 0xc4206ec000, 0x8000, 0x8000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a crypto/tls.(*block).readFromUntil(0xc42038c2d0, 0x7f4723200e78, 0xc420122630, 0x5, 0xc420122630, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96 crypto/tls.(*Conn).readRecord(0xc4200d7c00, 0x1431217, 0xc4200d7d20, 0x20) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0 crypto/tls.(*Conn).Read(0xc4200d7c00, 0xc4203b1000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100 bufio.(*Reader).Read(0xc4200a6300, 0xc42033e118, 0x9, 0x9, 0xc420718118, 0xc42095fb20, 0xc420089d10) /gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238 io.ReadAtLeast(0x14b3980, 0xc4200a6300, 0xc42033e118, 0x9, 0x9, 0x9, 0xc420089ce0, 0xc420089ce0, 0x406614) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86 io.ReadFull(0x14b3980, 0xc4200a6300, 0xc42033e118, 0x9, 0x9, 0xc4207180c0, 0xc420089d10, 0xc400005701) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58 kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc42033e118, 0x9, 0x9, 
0x14b3980, 0xc4200a6300, 0x0, 0xc400000000, 0x7efa2d, 0xc420089fb0) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc42033e0e0, 0xc420395c80, 0x0, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4 kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc420089fb0, 0x142fe58, 0xc4204797b0) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc4200d4000) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76 created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651 goroutine 5014 [chan send, 49 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4208295c0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 4732 [chan send, 52 minutes]: kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1.1(0x14ef100, 0xc4203c1ec0, 0xc4201223c8, 0xc4209190e0, 0xc42000e828, 0xc42000e838) /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:81 +0x138 created by kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1 /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:73 +0x386 goroutine 5140 [chan send, 47 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4208035f0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by 
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 make: *** [functest] Error 2 + make cluster-down ./cluster/down.sh