+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ [[ k8s-1.10.4-release =~ openshift-.* ]]
+ [[ k8s-1.10.4-release =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.4
+ KUBEVIRT_PROVIDER=k8s-1.10.4
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/07/31 13:44:29 Waiting for host: 192.168.66.101:22
2018/07/31 13:44:32 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/07/31 13:44:44 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.4
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 33.512451 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:2b23d18b8cc374c9fe15b661891c3f9c620da3985a0ef37bcda2add34901fce2

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/07/31 13:45:33 Waiting for host: 192.168.66.102:22
2018/07/31 13:45:36 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/07/31 13:45:48 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
[discovery] Trying to connect to API Server "192.168.66.101:6443"
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 39611920 kubectl
Sending file modes: C0600 5450 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01    Ready     master    46s       v1.10.4
node02    Ready     <none>    17s       v1.10.4
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ grep NotReady
++ cluster/kubectl.sh get nodes --no-headers
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    48s       v1.10.4
node02    Ready     <none>    19s       v1.10.4
+ make cluster-sync
./cluster/build.sh
Building ...
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
Compiling tests...
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.39 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
 ---> Using cache
 ---> b00c84523b53
Step 4/8 : WORKDIR /home/virt-controller
 ---> Using cache
 ---> b76b8bd8cd39
Step 5/8 : USER 1001
 ---> Using cache
 ---> b6d9ad9ed232
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
 ---> 9fd00dca9a95
Removing intermediate container 41d9d6bcb84c
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
 ---> Running in 0c82decc7d89
 ---> cf1a73c1d517
Removing intermediate container 0c82decc7d89
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-controller" ''
 ---> Running in 7c6aa394834a
 ---> 7bd5a8056d71
Removing intermediate container 7c6aa394834a
Successfully built 7bd5a8056d71
Sending build context to Docker daemon 43.32 MB
Step 1/9 : FROM kubevirt/libvirt:4.2.0
 ---> 5f0bfe81a3e0
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 945996802736
Step 3/9 : RUN dnf -y install socat genisoimage && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
 ---> Using cache
 ---> 1dcd22d08d0e
Step 4/9 : COPY virt-launcher /usr/bin/virt-launcher
 ---> 05db5ef358ce
Removing intermediate container e56e5e7d61bd
Step 5/9 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
 ---> Running in 7343e6b50cde
 ---> c95320ea5c5b
Removing intermediate container 7343e6b50cde
Step 6/9 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
 ---> Running in 743ce3904623
 ---> 81bd4e096357
Removing intermediate container 743ce3904623
Step 7/9 : COPY sock-connector /usr/share/kubevirt/virt-launcher/
 ---> a2368518489a
Removing intermediate container a529ee5cd3f3
Step 8/9 : ENTRYPOINT /usr/bin/virt-launcher
 ---> Running in 8fcb039ddec7
 ---> 143e74fa578a
Removing intermediate container 8fcb039ddec7
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-launcher" ''
 ---> Running in af87e9bf5013
 ---> 959f79fcacdf
Removing intermediate container af87e9bf5013
Successfully built 959f79fcacdf
Sending build context to Docker daemon 41.69 MB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/5 : COPY virt-handler /usr/bin/virt-handler
 ---> 3b3a7adb5277
Removing intermediate container 48cdd4ddad08
Step 4/5 : ENTRYPOINT /usr/bin/virt-handler
 ---> Running in 4a3b9d65aca1
 ---> 72ac8565d012
Removing intermediate container 4a3b9d65aca1
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-handler" ''
 ---> Running in 67b032088f18
 ---> 76419c958c56
Removing intermediate container 67b032088f18
Successfully built 76419c958c56
Sending build context to Docker daemon 38.81 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api
 ---> Using cache
 ---> ed1ebf600ee1
Step 4/8 : WORKDIR /home/virt-api
 ---> Using cache
 ---> 0769dad023e5
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0cb65afb0c2b
Step 6/8 : COPY virt-api /usr/bin/virt-api
 ---> cefbcc53b8e4
Removing intermediate container 3005898d4da2
Step 7/8 : ENTRYPOINT /usr/bin/virt-api
 ---> Running in f8149348ad19
 ---> dc5529446610
Removing intermediate container f8149348ad19
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-api" ''
 ---> Running in 5073c1139952
 ---> c888ecdbaf0a
Removing intermediate container 5073c1139952
Successfully built c888ecdbaf0a
Sending build context to Docker daemon 4.096 kB
Step 1/7 : FROM fedora:28
 ---> cc510acfcd70
Step 2/7 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/7 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img
 ---> Using cache
 ---> 02134835a6aa
Step 5/7 : ADD entrypoint.sh /
 ---> Using cache
 ---> ec0843818da7
Step 6/7 : CMD /entrypoint.sh
 ---> Using cache
 ---> 754029bb4bd2
Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.10.4-release0" ''
 ---> Using cache
 ---> 9508792c39fe
Successfully built 9508792c39fe
Sending build context to Docker daemon 2.56 kB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/5 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all
 ---> Using cache
 ---> 207487abe7b2
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "vm-killer" ''
 ---> Using cache
 ---> f73a0a60268a
Successfully built f73a0a60268a
Sending build context to Docker daemon 5.12 kB
Step 1/7 : FROM debian:sid
 ---> 68f33cf86aab
Step 2/7 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 5734d749eb5c
Step 3/7 : ENV container docker
 ---> Using cache
 ---> f8775a77966f
Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 1a40cf222a61
Step 5/7 : ADD entry-point.sh /
 ---> Using cache
 ---> 77b545d92fe7
Step 6/7 : CMD /entry-point.sh
 ---> Using cache
 ---> dfe20d463305
Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "registry-disk-v1alpha" ''
 ---> Using cache
 ---> 2b10738ed57f
Successfully built 2b10738ed57f
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33278/kubevirt/registry-disk-v1alpha:devel
 ---> 2b10738ed57f
Step 2/4 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> a84e691d223e
Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img
 ---> Using cache
 ---> 2802af14bf7e
Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" ''
 ---> Using cache
 ---> 68b2972a67e8
Successfully built 68b2972a67e8
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33278/kubevirt/registry-disk-v1alpha:devel
 ---> 2b10738ed57f
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> e353dde3d14f
Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2
 ---> Using cache
 ---> 5b72cb1e8314
Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" ''
 ---> Using cache
 ---> 74cc3a5f5187
Successfully built 74cc3a5f5187
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33278/kubevirt/registry-disk-v1alpha:devel
 ---> 2b10738ed57f
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> e353dde3d14f
Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
 ---> Using cache
 ---> 394554114a84
Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" ''
 ---> Using cache
 ---> 0c711d06e816
Successfully built 0c711d06e816
Sending build context to Docker daemon 35.59 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl
 ---> Using cache
 ---> 985fe391c056
Step 4/8 : WORKDIR /home/virtctl
 ---> Using cache
 ---> 3b2cae8ac543
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0c06e5b4a900
Step 6/8 : COPY subresource-access-test /subresource-access-test
 ---> cfd8355faad3
Removing intermediate container b263fbae8eb2
Step 7/8 : ENTRYPOINT /subresource-access-test
 ---> Running in b3d5e82402fd
 ---> 8fc038d050df
Removing intermediate container b3d5e82402fd
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "subresource-access-test" ''
 ---> Running in 156357de84d4
 ---> d0f45725a262
Removing intermediate container 156357de84d4
Successfully built d0f45725a262
Sending build context to Docker daemon 3.072 kB
Step 1/9 : FROM fedora:28
 ---> cc510acfcd70
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/9 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all
 ---> Using cache
 ---> d3456b1644b1
Step 5/9 : ENV GIMME_GO_VERSION 1.9.2
 ---> Using cache
 ---> 0ba81fddbba1
Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh
 ---> Using cache
 ---> 5d33abe3f819
Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin"
 ---> Using cache
 ---> 783826523be1
Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli
 ---> Using cache
 ---> 711bc8d15952
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "winrmcli" ''
 ---> Using cache
 ---> 24f824ca6fdb
Successfully built 24f824ca6fdb
Sending build context to Docker daemon 36.8 MB
Step 1/5 : FROM fedora:27
 ---> 9110ae7f579f
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> e3238544ad97
Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar
 ---> 79693d2c4601
Removing intermediate container b91d015177f8
Step 4/5 : ENTRYPOINT /example-hook-sidecar
 ---> Running in 5b3b340e0208
 ---> 6d1a4a650f80
Removing intermediate container 5b3b340e0208
Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.10.4-release0" ''
 ---> Running in 7e47ff655691
 ---> 163d31f166ed
Removing intermediate container 7e47ff655691
Successfully built 163d31f166ed
hack/build-docker.sh push
The push refers to a repository [localhost:33278/kubevirt/virt-controller]
0124af572359: Preparing
aa89340cf7a8: Preparing
891e1e4ef82a: Preparing
aa89340cf7a8: Pushed
0124af572359: Pushed
891e1e4ef82a: Pushed
devel: digest: sha256:6f6734ba69f6d1dacd121bf2351254d78e9f62daa808f617d6beb6037afa72b7 size: 949
The push refers to a repository [localhost:33278/kubevirt/virt-launcher]
ab36540f7932: Preparing
56439ba2e841: Preparing
ce28e4122119: Preparing
6f0e90135988: Preparing
af293cb2890d: Preparing
da38cf808aa5: Preparing
b83399358a92: Preparing
186d8b3e4fd8: Preparing
fa6154170bf5: Preparing
5eefb9960a36: Preparing
891e1e4ef82a: Preparing
b83399358a92: Waiting
891e1e4ef82a: Waiting
5eefb9960a36: Waiting
186d8b3e4fd8: Waiting
fa6154170bf5: Waiting
56439ba2e841: Pushed
ab36540f7932: Pushed
da38cf808aa5: Pushed
b83399358a92: Pushed
fa6154170bf5: Pushed
186d8b3e4fd8: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-controller
ce28e4122119: Pushed
af293cb2890d: Pushed
6f0e90135988: Pushed
5eefb9960a36: Pushed
devel: digest: sha256:e0c57be71c1ac1f4bdac27970796f0b67877ba3d879a12440b221eddaf3bcf27 size: 2620
The push refers to a repository [localhost:33278/kubevirt/virt-handler]
294a05e048ae: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-launcher
294a05e048ae: Pushed
devel: digest: sha256:bbc086878a8270ddbc6db2061857df821f9b52429fc461293782311f990290ec size: 741
The push refers to a repository [localhost:33278/kubevirt/virt-api]
eb125877854a: Preparing
82fc744c99b4: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-handler
82fc744c99b4: Pushed
eb125877854a: Pushed
devel: digest: sha256:8201d4b83007eb2e583c083d9ce90cf172d845c3776b92a8727fe99ed3c8e5dd size: 948
The push refers to a repository [localhost:33278/kubevirt/disks-images-provider]
71ad31feb2c5: Preparing
21d4b721776e: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Waiting
71ad31feb2c5: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-api
21d4b721776e: Pushed
devel: digest: sha256:c9707b33625230adba24ef057288743b19fa5fed6719ba2905ae9f09b8a414c6 size: 948
The push refers to a repository [localhost:33278/kubevirt/vm-killer]
c4cfadeeaf5f: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/disks-images-provider
c4cfadeeaf5f: Pushed
devel: digest: sha256:c2ccd152e32de4913556e8f0b5ef32096c4e00009c21b8801f89f15698e5d591 size: 740
The push refers to a repository [localhost:33278/kubevirt/registry-disk-v1alpha]
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
661cce8d8e52: Pushed
41e0baba3077: Pushed
25edbec0eaea: Pushed
devel: digest: sha256:2722e40221b319369587bab1209e382a1e1df29d2bab78c804f54a6d7d97cd73 size: 948
The push refers to a repository [localhost:33278/kubevirt/cirros-registry-disk-demo]
3309e89c39ce: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha
661cce8d8e52: Mounted from kubevirt/registry-disk-v1alpha
41e0baba3077: Mounted from kubevirt/registry-disk-v1alpha
3309e89c39ce: Pushed
devel: digest: sha256:37bb0d4295e8249513017b764f0c33b7ec8e2e33ad9ec9535d284ec4ae0bdb7b size: 1160
The push refers to a repository [localhost:33278/kubevirt/fedora-cloud-registry-disk-demo]
f17b8b191fd6: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
661cce8d8e52: Mounted from kubevirt/cirros-registry-disk-demo
25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo
41e0baba3077: Mounted from kubevirt/cirros-registry-disk-demo
f17b8b191fd6: Pushed
devel: digest: sha256:404f7a3f88683d3e3d8d910037f9f0e73136cfd2f96b4a20d656e11dfe6945e0 size: 1161
The push refers to a repository [localhost:33278/kubevirt/alpine-registry-disk-demo]
0840765381ac: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo
41e0baba3077: Mounted from kubevirt/fedora-cloud-registry-disk-demo
661cce8d8e52: Mounted from kubevirt/fedora-cloud-registry-disk-demo
0840765381ac: Pushed
devel: digest: sha256:1b571982fea4a8970550e9c0ce762a5e9a9f43cd39cb7384d99423e83e36609a size: 1160
The push refers to a repository [localhost:33278/kubevirt/subresource-access-test]
77f7dae16586: Preparing
25cb73590a9d: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/vm-killer
25cb73590a9d: Pushed
77f7dae16586: Pushed
devel: digest: sha256:0e155c0e5377b957b5ba0a5eee5d4188e7f9d9fbcd0968f90dac7f0c0800b257 size: 948
The push refers to a repository [localhost:33278/kubevirt/winrmcli]
f8083e002d0b: Preparing
53c709abc882: Preparing
9ca98a0f492b: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/subresource-access-test
f8083e002d0b: Pushed
9ca98a0f492b: Pushed
53c709abc882: Pushed
devel: digest: sha256:ea9d797b01af2846de597ff4d0beb5a62ff04857b6a4ad7278a3986ac150d672 size: 1165
The push refers to a repository [localhost:33278/kubevirt/example-hook-sidecar]
e2d2253add80: Preparing
39bae602f753: Preparing
e2d2253add80: Pushed
39bae602f753: Pushed
devel: digest: sha256:7ba2000e6876b8d0298e551f12f95028d44aa9416308bad8bda6b9ed58b4a8de size: 740
make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt'
Done
./cluster/clean.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release0
++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release0
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-174-gc58ac2f
++ KUBEVIRT_VERSION=v0.7.0-174-gc58ac2f
+ source cluster/k8s-1.10.4/provider.sh
++ set -e
++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.4.sh
++ source hack/config-provider-k8s-1.10.4.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl
+++ docker_prefix=localhost:33278/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + 
KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n default delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete secrets -l kubevirt.io No resources found + _kubectl -n default delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pv -l kubevirt.io No resources found + _kubectl -n default delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pvc -l kubevirt.io No resources found + _kubectl -n default delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete ds -l kubevirt.io No resources found + _kubectl -n default delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n default delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pods -l kubevirt.io No resources found + _kubectl -n default delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n default delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + 
cluster/k8s-1.10.4/.kubectl -n default delete rolebinding -l kubevirt.io No resources found + _kubectl -n default delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete roles -l kubevirt.io No resources found + _kubectl -n default delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete clusterroles -l kubevirt.io No resources found + _kubectl -n default delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ cluster/k8s-1.10.4/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ wc -l Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + for i in '${namespaces[@]}' + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete deployment -l kubevirt.io No resources found + _kubectl -n kube-system delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete rs -l kubevirt.io No resources found + 
_kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete customresourcedefinitions -l 
kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ wc -l ++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ cluster/k8s-1.10.4/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done 
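The cleanup pass traced above repeats the same shape for every resource kind in every namespace: set `KUBECONFIG`, then delete everything carrying the `kubevirt.io` label. A minimal sketch of that pattern, reconstructed from the xtrace output (the `_kubectl` and `cleanup_kubevirt` names and the `KUBECTL` override variable are illustrative assumptions, not the verbatim `cluster/*.sh` source):

```shell
# Wrapper matching the "+ export KUBECONFIG=... + cluster/k8s-1.10.4/.kubectl ..."
# pairs that recur throughout the trace above.
_kubectl() {
    export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
    "${KUBECTL:-cluster/k8s-1.10.4/.kubectl}" "$@"
}

cleanup_kubevirt() {
    local namespaces=(kube-system default)
    local kinds=(services apiservices validatingwebhookconfiguration secrets
                 pv pvc ds customresourcedefinitions pods clusterrolebinding
                 rolebinding roles clusterroles serviceaccounts)
    for ns in "${namespaces[@]}"; do
        for kind in "${kinds[@]}"; do
            # Deletes every kubevirt.io-labelled object of each kind;
            # "No resources found" in the trace just means the label
            # selector matched nothing on a fresh cluster.
            _kubectl -n "$ns" delete "$kind" -l kubevirt.io
        done
    done
}
```

The blanket label selector is what makes the teardown idempotent: on an already-clean cluster every delete is a harmless no-op, which is exactly what the wall of "No resources found" lines above records.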
./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 ++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-174-gc58ac2f ++ KUBEVIRT_VERSION=v0.7.0-174-gc58ac2f + source cluster/k8s-1.10.4/provider.sh ++ set -e ++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009 ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host 
--rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.10.4.sh ++ source hack/config-provider-k8s-1.10.4.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl +++ docker_prefix=localhost:33278/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... 
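The deploy step that the trace walks through next iterates over the generated release manifests, skipping demo content and creating everything else verbatim. A minimal sketch of that loop, under the assumption that the function name and plain `kubectl` invocation stand in for the script's own wrapper:

```shell
# Mirrors the "for manifest in ${MANIFESTS_OUT_DIR}/release/*" loop in the
# trace: demo-content.yaml is example material, not part of the deployment,
# so any *demo* manifest is skipped before "kubectl create -f" runs.
deploy_release_manifests() {
    local out_dir=$1
    for manifest in "$out_dir"/release/*; do
        [[ $manifest =~ .*demo.* ]] && continue
        kubectl create -f "$manifest"
    done
}
```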
+ [[ -z k8s-1.10.4-release ]] + [[ k8s-1.10.4-release =~ .*-dev ]] + [[ k8s-1.10.4-release =~ .*-release ]] + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]] + continue + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]] + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created serviceaccount "kubevirt-apiserver" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created role.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created serviceaccount "kubevirt-controller" created serviceaccount "kubevirt-privileged" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created 
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created service "virt-api" created deployment.extensions "virt-api" created deployment.extensions "virt-controller" created daemonset.extensions "virt-handler" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R persistentvolumeclaim "disk-alpine" created persistentvolume "host-path-disk-alpine" created persistentvolumeclaim "disk-custom" created persistentvolume "host-path-disk-custom" created daemonset.extensions "disks-images-provider" created serviceaccount "kubevirt-testing" created clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created + [[ k8s-1.10.4 =~ os-* ]] + echo Done Done + namespaces=(kube-system default) + [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]] + timeout=300 + sample=30 + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get pods -n kube-system --no-headers ++ grep -v Running + '[' -n 'virt-api-7d79764579-27kb9 0/1 ContainerCreating 0 3s virt-api-7d79764579-xrlr5 0/1 ContainerCreating 0 3s virt-controller-7d57d96b65-fmt47 0/1 ContainerCreating 0 3s virt-controller-7d57d96b65-lwrk6 0/1 
ContainerCreating 0 3s virt-handler-gr5ql 0/1 ContainerCreating 0 2s virt-handler-s6p2b 0/1 ContainerCreating 0 3s' ']' + echo 'Waiting for kubevirt pods to enter the Running state ...' Waiting for kubevirt pods to enter the Running state ... + kubectl get pods -n kube-system --no-headers + cluster/kubectl.sh get pods -n kube-system --no-headers + grep -v Running disks-images-provider-hppz6 0/1 ContainerCreating 0 1s disks-images-provider-jnzcs 0/1 ContainerCreating 0 1s virt-api-7d79764579-27kb9 0/1 ContainerCreating 0 4s virt-api-7d79764579-xrlr5 0/1 ContainerCreating 0 4s virt-controller-7d57d96b65-fmt47 0/1 ContainerCreating 0 4s virt-controller-7d57d96b65-lwrk6 0/1 ContainerCreating 0 4s virt-handler-gr5ql 0/1 ContainerCreating 0 3s virt-handler-s6p2b 0/1 ContainerCreating 0 4s + sleep 30 + current_time=30 + '[' 30 -gt 300 ']' ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get pods -n kube-system --no-headers ++ grep -v Running + '[' -n '' ']' + current_time=0 ++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ grep false + '[' -n '' ']' + kubectl get pods -n kube-system + cluster/kubectl.sh get pods -n kube-system NAME READY STATUS RESTARTS AGE disks-images-provider-hppz6 1/1 Running 0 36s disks-images-provider-jnzcs 1/1 Running 0 36s etcd-node01 1/1 Running 0 13m kube-apiserver-node01 1/1 Running 0 13m kube-controller-manager-node01 1/1 Running 0 13m kube-dns-86f4d74b45-fn68h 3/3 Running 0 14m kube-flannel-ds-hmrxh 1/1 Running 0 14m kube-flannel-ds-m99bp 1/1 Running 0 14m kube-proxy-98tfb 1/1 Running 0 14m kube-proxy-rk792 1/1 Running 0 14m kube-scheduler-node01 1/1 Running 0 13m virt-api-7d79764579-27kb9 1/1 Running 0 39s virt-api-7d79764579-xrlr5 1/1 Running 1 39s virt-controller-7d57d96b65-fmt47 1/1 Running 0 39s virt-controller-7d57d96b65-lwrk6 1/1 Running 
0 39s virt-handler-gr5ql 1/1 Running 0 38s virt-handler-s6p2b 1/1 Running 0 39s + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n default --no-headers ++ cluster/kubectl.sh get pods -n default --no-headers ++ grep -v Running No resources found. + '[' -n '' ']' + current_time=0 ++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ grep false ++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers + '[' -n '' ']' + kubectl get pods -n default + cluster/kubectl.sh get pods -n default No resources found. + kubectl version + cluster/kubectl.sh version Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:13:03Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:00:59Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"} + ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml' + [[ k8s-1.10.4-release =~ windows.* ]] + FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml' + make functest hack/dockerized "hack/build-func-tests.sh" sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200 go version go1.10 linux/amd64 go version go1.10 linux/amd64 Compiling tests... 
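The readiness wait traced above (timeout=300, sample=30, `grep -v Running`) polls until no pod is outside the Running phase. A minimal sketch of that loop; the function name and argument handling are assumptions, since the trace shows the same poll/sleep structure written inline:

```shell
# Poll a namespace until every pod reports the Running phase, or give up
# after the timeout. Matches the timeout=300 / sample=30 values and the
# "kubectl get pods ... --no-headers | grep -v Running" filter in the trace.
wait_for_running_pods() {
    local ns=$1 timeout=${2:-300} sample=${3:-30}
    local current_time=0 pending
    while true; do
        # Any pod line not containing "Running" keeps the loop alive.
        pending=$(kubectl get pods -n "$ns" --no-headers | grep -v Running || true)
        [ -z "$pending" ] && return 0
        echo "Waiting for kubevirt pods to enter the Running state ..."
        sleep "$sample"
        current_time=$((current_time + sample))
        [ "$current_time" -gt "$timeout" ] && { echo "timed out waiting for pods in $ns" >&2; return 1; }
    done
}
```

Phase alone is not enough, which is why the trace follows this with a second pass using `-ocustom-columns=status:status.containerStatuses[*].ready` piped through `grep false`: a pod can be Running while a container in it is still unready.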
compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1533045632 Will run 151 of 151 specs • [SLOW TEST:55.079 seconds] CloudInit UserData /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80 with cloudInitNoCloud userDataBase64 source /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81 should have cloud-init data /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:82 ------------------------------ 2018/07/31 10:01:29 read closing down: EOF • [SLOW TEST:117.943 seconds] 2018/07/31 10:03:27 read closing down: EOF CloudInit UserData /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80 with cloudInitNoCloud userDataBase64 source /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81 with injected ssh-key /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:92 should have ssh-key under authorized keys /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:93 ------------------------------ 2018/07/31 10:04:17 read closing down: EOF 2018/07/31 10:04:28 read closing down: EOF • [SLOW TEST:61.141 seconds] CloudInit UserData /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80 with cloudInitNoCloud userData source /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:118 should process provided cloud-init data /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:119 ------------------------------ 2018/07/31 10:05:14 read closing down: EOF • [SLOW TEST:46.647 seconds] CloudInit UserData /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80 should take 
user-data from k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:162 ------------------------------ • [SLOW TEST:19.157 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should successfully start with hook sidecar annotation /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60 ------------------------------ • [SLOW TEST:19.754 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should call Collect and OnDefineDomain on the hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67 ------------------------------ • [SLOW TEST:22.209 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should update domain XML with SM BIOS properties /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:83 ------------------------------ •2018/07/31 10:06:53 read closing down: EOF ------------------------------ • [SLOW TEST:36.770 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 with Alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71 should be successfully started /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 with Disk PVC 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:34.294 seconds] 2018/07/31 10:07:27 read closing down: EOF Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 with Alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71 should be successfully started /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 with CDRom PVC /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ 2018/07/31 10:10:35 read closing down: EOF • [SLOW TEST:241.389 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 with Alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71 should be successfully started and stopped multiple times /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 with Disk PVC /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ 2018/07/31 10:14:17 read closing down: EOF • [SLOW TEST:225.339 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 with Alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:71 should be successfully started and stopped multiple times /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 with CDRom PVC /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:53.935 seconds] 2018/07/31 10:16:08 read closing down: EOF Storage 
/root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 With an emptyDisk defined /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:113 should create a writeable emptyDisk with the right capacity /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:115 ------------------------------ • [SLOW TEST:50.194 seconds] Storage 2018/07/31 10:16:58 read closing down: EOF /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 With an emptyDisk defined and a specified serial number /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:163 should create a writeable emptyDisk with the specified serial number /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:165 ------------------------------ 2018/07/31 10:17:32 read closing down: EOF • [SLOW TEST:34.181 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 With ephemeral alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205 should be successfully started /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:207 ------------------------------ 2018/07/31 10:20:51 read closing down: EOF Get https://127.0.0.1:33276/api/v1/namespaces/kube-system/pods?labelSelector=kubevirt.io: unexpected EOF • Failure [198.659 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 With ephemeral alpine PVC /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:205 should not persist data [It] /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:218 Expected error: : 180000000000 expect: timer expired after 180 seconds not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:226 
------------------------------ STEP: Starting the VirtualMachineInstance STEP: Starting a VirtualMachineInstance STEP: Waiting until the VirtualMachineInstance will start level=info timestamp=2018-07-31T14:17:32.908650Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigzfvb kind=VirtualMachineInstance uid=76fbadcb-94cc-11e8-858f-525500d15501 msg="Created virtual machine pod virt-launcher-testvmigzfvb-dpbj2" level=info timestamp=2018-07-31T14:17:49.101858Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigzfvb kind=VirtualMachineInstance uid=76fbadcb-94cc-11e8-858f-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmigzfvb-dpbj2" level=info timestamp=2018-07-31T14:17:50.927673Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigzfvb kind=VirtualMachineInstance uid=76fbadcb-94cc-11e8-858f-525500d15501 msg="VirtualMachineInstance defined." level=info timestamp=2018-07-31T14:17:50.966001Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigzfvb kind=VirtualMachineInstance uid=76fbadcb-94cc-11e8-858f-525500d15501 msg="VirtualMachineInstance started." STEP: Writing an arbitrary file to it's EFI partition level=info timestamp=2018-07-31T14:20:51.103508Z pos=utils.go:1291 component=tests namespace=kubevirt-test-default name=testvmigzfvb kind=VirtualMachineInstance uid= msg="Login: [{2 \r\n\r\n\r\nISOLINUX 6.04 6.04-pre1 Copyright (C) 1994-2015 H. 
Peter Anvin et al\r\nboot: \u001b[?7h\r\n []}]" 2018/07/31 10:24:29 read closing down: EOF • [SLOW TEST:218.248 seconds] Storage /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:46 Starting a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:70 With VirtualMachineInstance with two PVCs /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:266 should start vmi multiple times /root/go/src/kubevirt.io/kubevirt/tests/storage_test.go:278 ------------------------------ • ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds] Templates /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42 Launching VMI from VM Template [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60 with given Fedora Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193 should succeed to generate a VM JSON file using oc-process command /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:150 Skip test that requires oc binary /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1422 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.002 seconds] Templates /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42 Launching VMI from VM Template [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60 with given Fedora Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193 with given VM JSON from the Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152 should succeed to create a VM using oc-create command /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:156 Skip test that requires oc binary /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1422 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.004 seconds] Templates /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42 Launching VMI from VM Template [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60 with given 
Fedora Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193 with given VM JSON from the Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152 with given VM from the VM JSON /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158 should succeed to launch a VMI using oc-patch command /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:161 Skip test that requires oc binary /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1422 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.003 seconds] Templates /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:42 Launching VMI from VM Template [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:60 with given Fedora Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:193 with given VM JSON from the Template /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:152 with given VM from the VM JSON /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:158 with given VMI from the VM /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:163 should succeed to terminate the VMI using oc-patch command /root/go/src/kubevirt.io/kubevirt/tests/template_test.go:166 Skip test that requires oc binary /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1422 ------------------------------ •• ------------------------------ • [SLOW TEST:19.449 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should update VirtualMachine once VMIs are up /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195 ------------------------------ • [SLOW TEST:8.741 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should remove VirtualMachineInstance once the VMI is marked for deletion /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:204 
------------------------------ • ------------------------------ • [SLOW TEST:89.108 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should recreate VirtualMachineInstance if it gets deleted /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245 ------------------------------ • [SLOW TEST:116.158 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265 ------------------------------ • [SLOW TEST:100.902 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should stop VirtualMachineInstance if running set to false /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325 ------------------------------ • [SLOW TEST:432.666 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should start and stop VirtualMachineInstance multiple times /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333 ------------------------------ • [SLOW TEST:126.493 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should not update the VirtualMachineInstance spec if Running /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346 ------------------------------ Pod name: disks-images-provider-hppz6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jnzcs Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-27kb9 Pod phase: Running 2018/07/31 
14:43:13 http: TLS handshake error from 10.244.1.1:45816: EOF level=info timestamp=2018-07-31T14:43:22.481282Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T14:43:22.523109Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/31 14:43:23 http: TLS handshake error from 10.244.1.1:45822: EOF 2018/07/31 14:43:33 http: TLS handshake error from 10.244.1.1:45828: EOF level=info timestamp=2018-07-31T14:43:41.273276Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 14:43:43 http: TLS handshake error from 10.244.1.1:45834: EOF level=info timestamp=2018-07-31T14:43:48.975268Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-31T14:43:48.978975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-31T14:43:52.661920Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T14:43:52.670661Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/31 14:43:53 http: TLS handshake error from 10.244.1.1:45840: EOF 2018/07/31 14:44:03 http: TLS handshake error from 10.244.1.1:45846: EOF level=info timestamp=2018-07-31T14:44:11.171912Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 14:44:13 http: TLS handshake error from 10.244.1.1:45852: EOF Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running 2018/07/31 14:41:48 http: TLS handshake error from 10.244.0.1:59088: EOF 2018/07/31 14:41:58 http: TLS handshake error from 10.244.0.1:59112: EOF 2018/07/31 14:42:08 http: TLS handshake error from 10.244.0.1:59136: EOF 2018/07/31 14:42:18 http: TLS handshake error from 10.244.0.1:59160: EOF 2018/07/31 14:42:28 http: TLS handshake error from 10.244.0.1:59184: EOF 2018/07/31 14:42:38 http: TLS handshake error from 10.244.0.1:59208: EOF 2018/07/31 14:42:48 http: TLS handshake error from 10.244.0.1:59232: EOF 2018/07/31 14:42:58 http: TLS handshake error from 10.244.0.1:59256: EOF 2018/07/31 14:43:08 http: TLS handshake error from 10.244.0.1:59280: EOF 2018/07/31 14:43:18 http: TLS handshake error from 10.244.0.1:59304: EOF 2018/07/31 14:43:28 http: TLS handshake error from 10.244.0.1:59328: EOF 2018/07/31 14:43:38 http: TLS handshake error from 10.244.0.1:59352: EOF 2018/07/31 14:43:48 http: TLS handshake error from 10.244.0.1:59376: EOF 2018/07/31 14:43:58 http: TLS handshake error from 10.244.0.1:59400: EOF 2018/07/31 14:44:08 http: TLS handshake error from 10.244.0.1:59424: EOF Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running level=info timestamp=2018-07-31T14:19:03.966978Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-lwrk6 Pod phase: Running level=info timestamp=2018-07-31T14:39:26.696883Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:26.703136Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:26.703358Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:26.763815Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:26.764011Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:26.777410Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5wdjh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5wdjh" level=info timestamp=2018-07-31T14:39:26.804457Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:26.804678Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:26.839359Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5wdjh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance 
kubevirt-test-default/testvmi5wdjh" level=info timestamp=2018-07-31T14:39:43.888856Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:43.891757Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:45.305543Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:45.305751Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-07-31T14:39:45.329072Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Started processing VM" level=info timestamp=2018-07-31T14:39:45.329174Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=8608737a-94cf-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-handler-gr5ql Pod phase: Running level=error timestamp=2018-07-31T14:40:17.133496Z pos=vm.go:424 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= reason="connection is shut down" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-07-31T14:40:17.133805Z pos=vm.go:251 component=virt-handler reason="connection is shut down" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirgjtx" level=info timestamp=2018-07-31T14:40:19.285079Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-31T14:40:19.285539Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirgjtx, existing: false\n" level=info timestamp=2018-07-31T14:40:19.285616Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:40:19.285779Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:40:19.287880Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T14:40:19.288473Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirgjtx, existing: false\n" level=info timestamp=2018-07-31T14:40:19.288734Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:40:19.288957Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:40:19.289159Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T14:40:37.615958Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirgjtx, existing: false\n" level=info timestamp=2018-07-31T14:40:37.616272Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:40:37.616513Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:40:37.616780Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirgjtx kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-s6p2b Pod phase: Running level=info timestamp=2018-07-31T14:26:14.091263Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind= uid=824400b7-94cd-11e8-bcd5-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:26:14.091475Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind= uid=824400b7-94cd-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T14:26:14.102465Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmisks6d, existing: true\n" level=info timestamp=2018-07-31T14:26:14.102617Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-07-31T14:26:14.102953Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:26:14.104013Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind= uid=824400b7-94cd-11e8-bcd5-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-31T14:26:14.104463Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind= uid=824400b7-94cd-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T14:26:29.254463Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmisks6d, existing: false\n" level=info timestamp=2018-07-31T14:26:29.273498Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:26:29.276149Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:26:29.277349Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T14:26:31.883101Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmisks6d, existing: false\n" level=info timestamp=2018-07-31T14:26:31.886892Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T14:26:31.887298Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T14:26:31.887492Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmisks6d kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
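Triage note (not part of the captured output): the virt-handler and virt-controller sections above are structured `key=value` log entries. A minimal parsing sketch for filtering them by component or VMI name is below; the regex and field names are assumptions inferred only from the entries shown in this dump, not from KubeVirt's own tooling.

```python
import re

# Sketch: parse KubeVirt-style key=value log entries (as dumped above) into
# dicts. Handles quoted values like msg="..." (including escaped quotes) and
# bare values; empty values such as `uid=` become empty strings.
ENTRY_RE = re.compile(r'(\w+)=("(?:[^"\\]|\\.)*"|\S*)')

def parse_entry(line):
    fields = {}
    for key, val in ENTRY_RE.findall(line):
        if len(val) >= 2 and val.startswith('"') and val.endswith('"'):
            val = val[1:-1].replace('\\"', '"')  # strip quotes, unescape
        fields[key] = val
    return fields

# Sample entry copied from the virt-handler log above.
sample = ('level=info timestamp=2018-07-31T14:26:31.887492Z pos=vm.go:440 '
          'component=virt-handler namespace=kubevirt-test-default '
          'name=testvmisks6d kind=VirtualMachineInstance uid= '
          'msg="Synchronization loop succeeded."')

entry = parse_entry(sample)
print(entry["component"], entry["name"], entry["msg"])
```

Filtering a whole dump then reduces to `[e for e in map(parse_entry, lines) if e.get("name") == "testvmisks6d"]`.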
Pod name: virt-launcher-testvmi5wdjh-ml6w8 Pod phase: Running level=info timestamp=2018-07-31T14:39:44.167650Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:39:44.654860Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:39:44.664354Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:39:44.684650Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 04cbb334-b565-55a2-b0e3-a799404d99d3" level=info timestamp=2018-07-31T14:39:44.686206Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:39:45.215666Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:39:45.229449Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:39:45.232065Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:39:45.258941Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:39:45.268527Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=861184f8-94cf-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:39:45.271540Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=861184f8-94cf-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:39:45.272868Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:39:45.276771Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:39:45.315550Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wdjh kind= uid=861184f8-94cf-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:39:45.696604Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 04cbb334-b565-55a2-b0e3-a799404d99d3: 164"
2018/07/31 10:44:16 read closing down: EOF
• Failure [290.761 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
A valid VirtualMachine given
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
should survive guest shutdown, multiple times [It]
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387

Timed out after 240.000s.
No new VirtualMachineInstance instance showed up
Expected
  : false
to be true

/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:429
------------------------------
STEP: Creating new VMI, not running
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
STEP: Getting the running VirtualMachineInstance
STEP: Obtaining the serial console
STEP: Guest shutdown
STEP: waiting for the controller to replace the shut-down vmi with a new instance
VM testvmicjmwc was scheduled to start
• [SLOW TEST:18.720 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
A valid VirtualMachine given
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
Using virtctl interface
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
should start a VirtualMachineInstance once
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436
------------------------------
VM testvmistwdg was scheduled to stop
• [SLOW TEST:113.986 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
A valid VirtualMachine given
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
Using virtctl interface
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
should stop a VirtualMachineInstance once
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467
------------------------------
• [SLOW TEST:14.138 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:13.783 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given an vm
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.035 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi preset
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:14.124 seconds]
User Access
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33
With default kubevirt service accounts
/root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41
should verify permissions are correct for view, edit, and admin
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
given a vmi replica set
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:38.327 seconds]
LeaderElection
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43
Start a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53
when the controller pod is not running
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54
should success
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55
------------------------------
• [SLOW TEST:135.229 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
Starting and stopping the same VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:90
with ephemeral registry disk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:91
should success multiple times
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:92
------------------------------
• [SLOW TEST:18.280 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
Starting a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:111
with ephemeral registry disk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:112
should not modify the spec on status update
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:113
------------------------------
• [SLOW TEST:29.105 seconds]
RegistryDisk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:41
Starting multiple VMIs
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:129
with ephemeral registry disk
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:130
should success
/root/go/src/kubevirt.io/kubevirt/tests/registry_disk_test.go:131
------------------------------
2018/07/31 10:51:56 read closing down: EOF
Service cluster-ip-vmi successfully exposed for virtualmachineinstance testvmidhbnn
• [SLOW TEST:53.393 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on a VM
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
Expose ClusterIP service
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:68
Should expose a Cluster IP service on a VMI and connect to it
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:71
------------------------------
Service cluster-ip-target-vmi successfully exposed for virtualmachineinstance testvmidhbnn
•Service node-port-vmi successfully exposed for virtualmachineinstance testvmidhbnn
------------------------------
• [SLOW TEST:8.311 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
Expose service on a VM
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61 Expose NodePort service /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:124 Should expose a NodePort service on a VMI and connect to it /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:129 ------------------------------ 2018/07/31 10:52:54 read closing down: EOF Service cluster-ip-udp-vmi successfully exposed for virtualmachineinstance testvmi5wf8j • [SLOW TEST:50.937 seconds] Expose /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53 Expose UDP service on a VMI /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166 Expose ClusterIP UDP service /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:173 Should expose a ClusterIP service on a VMI and connect to it /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:177 ------------------------------ Service node-port-udp-vmi successfully exposed for virtualmachineinstance testvmi5wf8j Pod name: disks-images-provider-hppz6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jnzcs Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-27kb9 Pod phase: Running level=info timestamp=2018-07-31T14:52:55.875526Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/31 14:53:03 http: TLS handshake error from 10.244.1.1:46166: EOF level=info timestamp=2018-07-31T14:53:11.173565Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 14:53:13 http: TLS handshake error from 10.244.1.1:46172: EOF 2018/07/31 14:53:24 http: TLS handshake error from 10.244.1.1:46178: EOF level=info timestamp=2018-07-31T14:53:26.026157Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 
statusCode=200 contentLength=136 level=info timestamp=2018-07-31T14:53:26.063508Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/07/31 14:53:33 http: TLS handshake error from 10.244.1.1:46184: EOF level=info timestamp=2018-07-31T14:53:41.305258Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 14:53:44 http: TLS handshake error from 10.244.1.1:46190: EOF level=info timestamp=2018-07-31T14:53:49.118990Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-31T14:53:49.151466Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 14:53:54 http: TLS handshake error from 10.244.1.1:46196: EOF level=info timestamp=2018-07-31T14:53:56.159909Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T14:53:56.263292Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running 2018/07/31 14:51:38 http: TLS handshake error from 10.244.0.1:60516: EOF 2018/07/31 14:51:48 http: TLS handshake error from 10.244.0.1:60540: EOF 2018/07/31 14:51:58 http: TLS handshake error from 10.244.0.1:60564: EOF 2018/07/31 14:52:08 http: TLS handshake error from 10.244.0.1:60594: EOF 2018/07/31 14:52:18 http: TLS handshake error from 10.244.0.1:60618: EOF 2018/07/31 14:52:28 http: TLS handshake error from 10.244.0.1:60648: EOF 
2018/07/31 14:52:38 http: TLS handshake error from 10.244.0.1:60672: EOF 2018/07/31 14:52:48 http: TLS handshake error from 10.244.0.1:60696: EOF 2018/07/31 14:52:58 http: TLS handshake error from 10.244.0.1:60720: EOF 2018/07/31 14:53:08 http: TLS handshake error from 10.244.0.1:60744: EOF 2018/07/31 14:53:18 http: TLS handshake error from 10.244.0.1:60768: EOF 2018/07/31 14:53:28 http: TLS handshake error from 10.244.0.1:60792: EOF 2018/07/31 14:53:38 http: TLS handshake error from 10.244.0.1:60816: EOF 2018/07/31 14:53:48 http: TLS handshake error from 10.244.0.1:60840: EOF 2018/07/31 14:53:58 http: TLS handshake error from 10.244.0.1:60864: EOF Pod name: virt-controller-7d57d96b65-c5mrh Pod phase: Running level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running level=info timestamp=2018-07-31T14:50:37.712992Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:50:37.767504Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T14:50:37.775010Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:50:37.953851Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T14:50:37.954135Z 
pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:50:37.965231Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T14:50:37.965406Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:50:37.987487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5s9j4\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5s9j4" level=info timestamp=2018-07-31T14:50:38.063872Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihhjlk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihhjlk" level=info timestamp=2018-07-31T14:50:38.499170Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmij4kc4\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmij4kc4" level=info timestamp=2018-07-31T14:51:06.500530Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidhbnn kind= 
uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T14:51:06.502007Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:52:08.322178Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T14:52:08.324341Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T14:52:08.660525Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5wf8j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5wf8j" Pod name: virt-handler-gr5ql Pod phase: Running level=info timestamp=2018-07-31T14:52:25.968189Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind=Domain uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-31T14:52:25.991805Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T14:52:25.992083Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi5wf8j, existing: true\n" level=info timestamp=2018-07-31T14:52:25.992140Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-07-31T14:52:25.992253Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T14:52:25.992380Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T14:52:25.992509Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="No update processing required" level=info timestamp=2018-07-31T14:52:26.010951Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-31T14:52:26.034294Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T14:52:26.034518Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi5wf8j, existing: true\n" level=info timestamp=2018-07-31T14:52:26.034551Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-31T14:52:26.034591Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T14:52:26.034617Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T14:52:26.034723Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update" level=info timestamp=2018-07-31T14:52:26.042191Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-s6p2b Pod phase: Running level=info timestamp=2018-07-31T14:50:55.780922Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind=Domain uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Domain is in state Running reason Unknown" level=info timestamp=2018-07-31T14:50:55.786517Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T14:50:55.786602Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmij4kc4, existing: true\n" level=info timestamp=2018-07-31T14:50:55.786624Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-07-31T14:50:55.786657Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T14:50:55.797937Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T14:50:55.798096Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="No update processing required" level=info timestamp=2018-07-31T14:50:55.807527Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED" level=info timestamp=2018-07-31T14:50:55.822295Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T14:50:55.822379Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmij4kc4, existing: true\n"
level=info timestamp=2018-07-31T14:50:55.822400Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:50:55.822425Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:50:55.822442Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:50:55.822512Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:50:55.853042Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
Pod name: netcat9579t
Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.101 30017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcathw855
Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.102 30017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcatn4p6m
Pod phase: Succeeded
++ head -n 1
+++ echo
+++ nc -ul 29016
+++ nc -up 29016 10.109.149.60 29017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatp45v5
Pod phase: Succeeded
++ head -n 1
+++ nc 10.100.27.166 27017 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatz77rz
Pod phase: Succeeded
++ head -n 1
+++ echo
+++ nc -ul 28016
+++ nc -up 28016 10.111.226.105 28017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatzzbl5
Pod phase: Running
++ head -n 1
+++ nc -up 31016 192.168.66.101 31017 -i 1 -w 1
+++ nc -ul 31016
+++ echo
Pod name: virt-launcher-testvmi5s9j4-dwb26
Pod phase: Running
level=info timestamp=2018-07-31T14:51:01.818046Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.107995Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:03.115100Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.129268Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 74e95e32-c605-4e5b-ab2e-535e2dc5469f"
level=info timestamp=2018-07-31T14:51:03.129519Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:03.478545Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.548433Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.554041Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.575687Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.637909Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:03.640781Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:03.641678Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.645928Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.170148Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 74e95e32-c605-4e5b-ab2e-535e2dc5469f: 173"
level=info timestamp=2018-07-31T14:51:05.585956Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmi5wf8j-47t7n
Pod phase: Running
level=info timestamp=2018-07-31T14:52:24.873896Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.593002Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:52:25.599178Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:25.610491Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 31f24abc-f0b2-4313-bd48-8f2624686dea"
level=info timestamp=2018-07-31T14:52:25.611001Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:52:25.897454Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.957690Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:52:25.959588Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:52:25.969425Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:25.969667Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.973343Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:52:26.008558Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:52:26.011850Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:26.039874Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:52:26.618540Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 31f24abc-f0b2-4313-bd48-8f2624686dea: 164"
Pod name: virt-launcher-testvmidhbnn-p8tqq
Pod phase: Running
level=info timestamp=2018-07-31T14:51:25.558976Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:25.570314Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:25.586591Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 19c3ec91-c8dd-432d-8729-e75751fe0c35"
level=info timestamp=2018-07-31T14:51:25.587165Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:25.852078Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:25.895108Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:25.909495Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:25.931430Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:25.944027Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:25.947038Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:25.982096Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:25.990746Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:26.004004Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:26.016876Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:26.595202Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 19c3ec91-c8dd-432d-8729-e75751fe0c35: 164"
Pod name: virt-launcher-testvmihhjlk-jrbgz
Pod phase: Running
level=info timestamp=2018-07-31T14:51:03.942077Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:04.890679Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:04.908465Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.939768Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 48827b14-3dff-46cf-b658-7979efd5c7a4"
level=info timestamp=2018-07-31T14:51:04.940166Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:05.653439Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.731983Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.738813Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.757273Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.797633Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:05.812692Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.822585Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.842200Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.892609Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.989760Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 48827b14-3dff-46cf-b658-7979efd5c7a4: 172"
Pod name: virt-launcher-testvmihjws7-mzqqw
Pod phase: Running
level=info timestamp=2018-07-31T14:51:03.760254Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:04.779934Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:04.803402Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.814361Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 0f5ed251-287e-45b0-a389-5134e64b7531"
level=info timestamp=2018-07-31T14:51:04.814892Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:05.289719Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.323360Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.327671Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.379025Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.408457Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:05.422509Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.423975Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.429970Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.601066Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.898541Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 0f5ed251-287e-45b0-a389-5134e64b7531: 175"
Pod name: virt-launcher-testvmij4kc4-84hsl
Pod phase: Running
level=info timestamp=2018-07-31T14:50:54.538199Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:50:55.482949Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:50:55.490061Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:50:55.533023Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 17af30fd-508b-4af4-aa9a-109c7f74f9d0"
level=info timestamp=2018-07-31T14:50:55.535858Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:50:55.731572Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:50:55.773478Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:50:55.773862Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:50:55.777880Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:50:55.784117Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:50:55.784326Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:50:55.806171Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:50:55.808161Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:50:55.829940Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:50:56.555125Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 17af30fd-508b-4af4-aa9a-109c7f74f9d0: 159"
Pod name: virt-launcher-testvmipn5kt-w8jz9
Pod phase: Running
level=info timestamp=2018-07-31T14:51:01.987867Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.087187Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:03.098265Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.106106Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID e9e1e65c-43ce-4142-8bc7-95f3e591582a"
level=info timestamp=2018-07-31T14:51:03.106678Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:03.395245Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.485845Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.488430Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:03.502773Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:03.503846Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.504024Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.558162Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.567145Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.120492Z pos=monitor.go:222 component=virt-launcher msg="Found PID for e9e1e65c-43ce-4142-8bc7-95f3e591582a: 167"
level=info timestamp=2018-07-31T14:51:05.563864Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
• Failure [62.871 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VMI
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
    Expose NodePort UDP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:205
      Should expose a NodePort service on a VMI and connect to it [It]
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:210

      Timed out after 60.000s.
      Expected
          : Running
      to equal
          : Succeeded

      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:247
------------------------------
STEP: Exposing the service via virtctl command
STEP: Getting back the cluster IP given for the service
STEP: Starting a pod which tries to reach the VMI via ClusterIP
STEP: Getting the node IP from all nodes
STEP: Starting a pod which tries to reach the VMI via NodePort
STEP: Waiting for the pod to report a successful connection attempt
2018/07/31 10:54:46 read closing down: EOF
2018/07/31 10:54:57 read closing down: EOF
Service cluster-ip-vmirs successfully exposed for vmirs replicaset7mpzb
• [SLOW TEST:60.882 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VMI replica set
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:253
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:286
      Should create a ClusterIP service on VMRS and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:290
------------------------------
Service cluster-ip-vm successfully exposed for virtualmachine testvmiqf9rp
VM testvmiqf9rp was scheduled to start
2018/07/31 10:55:48 read closing down: EOF
• [SLOW TEST:50.375 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on an VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:318
    Expose ClusterIP service
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:362
      Connect to ClusterIP services that was set when VM was offline
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:363
------------------------------
2018/07/31 10:56:44 read closing down: EOF
2018/07/31 10:57:32 read closing down: EOF
Pod name: disks-images-provider-hppz6
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jnzcs
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-27kb9
Pod phase: Running
level=info timestamp=2018-07-31T14:56:26.912103Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/31 14:56:33 http: TLS handshake error from 10.244.1.1:46302: EOF
level=info timestamp=2018-07-31T14:56:41.112978Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:56:44 http: TLS handshake error from 10.244.1.1:46308: EOF
level=info timestamp=2018-07-31T14:56:49.179637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T14:56:49.185700Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:56:54 http: TLS handshake error from 10.244.1.1:46314: EOF
level=info timestamp=2018-07-31T14:56:56.996743Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T14:56:56.997747Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/31 14:57:03 http: TLS handshake error from 10.244.1.1:46320: EOF
level=info timestamp=2018-07-31T14:57:11.152850Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:57:13 http: TLS handshake error from 10.244.1.1:46326: EOF
2018/07/31 14:57:24 http: TLS handshake error from 10.244.1.1:46332: EOF
level=info timestamp=2018-07-31T14:57:27.240186Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T14:57:27.242057Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-7d79764579-xrlr5
Pod phase: Running
level=info timestamp=2018-07-31T14:55:22.176311Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded"
2018/07/31 14:55:28 http: TLS handshake error from 10.244.0.1:32868: EOF
2018/07/31 14:55:38 http: TLS handshake error from 10.244.0.1:32892: EOF
2018/07/31 14:55:48 http: TLS handshake error from 10.244.0.1:32916: EOF
2018/07/31 14:55:58 http: TLS handshake error from 10.244.0.1:32940: EOF
2018/07/31 14:56:08 http: TLS handshake error from 10.244.0.1:32964: EOF
2018/07/31 14:56:18 http: TLS handshake error from 10.244.0.1:32994: EOF
2018/07/31 14:56:28 http: TLS handshake error from 10.244.0.1:33018: EOF
2018/07/31 14:56:38 http: TLS handshake error from 10.244.0.1:33042: EOF
2018/07/31 14:56:48 http: TLS handshake error from 10.244.0.1:33068: EOF
2018/07/31 14:56:58 http: TLS handshake error from 10.244.0.1:33092: EOF
level=info timestamp=2018-07-31T14:57:03.205670Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded"
2018/07/31 14:57:08 http: TLS handshake error from 10.244.0.1:33124: EOF
2018/07/31 14:57:18 http: TLS handshake error from 10.244.0.1:33148: EOF
2018/07/31 14:57:28 http: TLS handshake error from 10.244.0.1:33172: EOF
Pod name: virt-controller-7d57d96b65-c5mrh
Pod phase: Running
level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-fmt47
Pod phase: Running
level=info timestamp=2018-07-31T14:55:03.404298Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:03.404388Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:03.482306Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:03.482394Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:19.965319Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:19.978023Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.518982Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.519110Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.572218Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.572317Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:53.409538Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:55:53.410311Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:44.678271Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:56:44.684287Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:45.080222Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4qj5k"
Pod name: virt-handler-gr5ql
Pod phase: Running
level=info timestamp=2018-07-31T14:57:03.124625Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-31T14:57:03.148879Z pos=vm.go:431 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-31T14:57:03.149030Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4qj5k"
level=info timestamp=2018-07-31T14:57:03.149135Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4qj5k, existing: true\n"
level=info timestamp=2018-07-31T14:57:03.149164Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:57:03.149241Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:57:03.149262Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:57:03.149368Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:57:03.163907Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:57:03.164000Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4qj5k, existing: true\n"
level=info timestamp=2018-07-31T14:57:03.164034Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:57:03.164083Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:57:03.164102Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:57:03.164179Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:57:03.173724Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-s6p2b
Pod phase: Running
level=info timestamp=2018-07-31T14:56:12.270150Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=Domain uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-31T14:56:12.305581Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-31T14:56:12.320553Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:56:12.339747Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: true\n"
level=info timestamp=2018-07-31T14:56:12.339884Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-31T14:56:12.339955Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:56:12.340016Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:56:12.340184Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-31T14:56:12.392882Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:56:12.394297Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: true\n"
level=info timestamp=2018-07-31T14:56:12.394346Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:56:12.394375Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:56:12.394393Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:56:12.394500Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:56:12.401876Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
Pod name: netcat9579t
Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.101 30017 -i 1 -w 1
Hello World!
succeeded + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0

Pod name: netcatdkpj4
Pod phase: Succeeded
++ head -n 1 +++ nc 10.96.46.205 27017 -i 1 -w 1 Hello World! succeeded + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0

Pod name: netcathw855
Pod phase: Succeeded
++ head -n 1 +++ nc 192.168.66.102 30017 -i 1 -w 1 Hello World! succeeded + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0

Pod name: netcatkg9qx
Pod phase: Succeeded
++ head -n 1 +++ nc 10.98.62.24 27017 -i 1 -w 1 Hello World! succeeded + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0

Pod name: netcatn4p6m
Pod phase: Succeeded
++ head -n 1 +++ echo +++ nc -ul 29016 +++ nc -up 29016 10.109.149.60 29017 -i 1 -w 1 + x='Hello UDP World!' + echo 'Hello UDP World!' + '[' 'Hello UDP World!' = 'Hello UDP World!' ']' + echo succeeded + exit 0 Hello UDP World! succeeded

Pod name: netcatp45v5
Pod phase: Succeeded
++ head -n 1 +++ nc 10.100.27.166 27017 -i 1 -w 1 + x='Hello World!' + echo 'Hello World!' + '[' 'Hello World!' = 'Hello World!' ']' + echo succeeded + exit 0 Hello World! succeeded

Pod name: netcatz77rz
Pod phase: Succeeded
++ head -n 1 +++ echo +++ nc -ul 28016 +++ nc -up 28016 10.111.226.105 28017 -i 1 -w 1 + x='Hello UDP World!' + echo 'Hello UDP World!' + '[' 'Hello UDP World!' = 'Hello UDP World!' ']' + echo succeeded + exit 0 Hello UDP World!
succeeded Pod name: netcatzzbl5 Pod phase: Running ++ head -n 1 +++ nc -up 31016 192.168.66.101 31017 -i 1 -w 1 +++ nc -ul 31016 +++ echo Pod name: virt-launcher-testvmi2m9wjb6n2b-ktkgm Pod phase: Running level=info timestamp=2018-07-31T14:54:19.756459Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:54:20.731235Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:54:20.744667Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 1f645137-2f0e-430d-8a4b-a7cedf6d77ad" level=info timestamp=2018-07-31T14:54:20.744967Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:54:20.745025Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:21.203978Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:54:21.252784Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:54:21.263605Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:21.263873Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:54:21.295110Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:54:21.301530Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:54:21.302960Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:54:21.307740Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:21.336154Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:54:21.755827Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 1f645137-2f0e-430d-8a4b-a7cedf6d77ad: 164" Pod name: virt-launcher-testvmi2m9wjld5bq-fj4ck Pod phase: Running level=info timestamp=2018-07-31T14:54:19.666060Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:54:20.377603Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 38480970-42fa-4cba-a79a-a7777038da83" level=info timestamp=2018-07-31T14:54:20.378061Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:54:20.406606Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:54:20.465414Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:20.749254Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:54:20.790335Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:54:20.803023Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:20.816511Z pos=client.go:136 component=virt-launcher 
msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:54:20.862624Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:54:20.874819Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:54:20.884004Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:54:20.913008Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:54:21.016804Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:54:21.439234Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 38480970-42fa-4cba-a79a-a7777038da83: 164" Pod name: virt-launcher-testvmi4qj5k-xrx2n Pod phase: Running level=info timestamp=2018-07-31T14:57:03.037277Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:57:03.051635Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:57:03.064607Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:57:03.070263Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:57:03.119200Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 98897e6f-c7fe-48e3-8e18-a8e87e56a264" level=info timestamp=2018-07-31T14:57:03.119756Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:57:03.151437Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network" level=info timestamp=2018-07-31T14:57:03.151883Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n" level=info timestamp=2018-07-31T14:57:03.151921Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local" level=info timestamp=2018-07-31T14:57:03.159757Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:57:03.165006Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network" level=info timestamp=2018-07-31T14:57:03.165378Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n" level=info timestamp=2018-07-31T14:57:03.165408Z pos=converter.go:752 component=virt-launcher msg="Found 
search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local" level=info timestamp=2018-07-31T14:57:03.167773Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:57:04.135220Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 98897e6f-c7fe-48e3-8e18-a8e87e56a264: 160" Pod name: virt-launcher-testvmi5s9j4-dwb26 Pod phase: Running level=info timestamp=2018-07-31T14:51:01.818046Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.107995Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:03.115100Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.129268Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 74e95e32-c605-4e5b-ab2e-535e2dc5469f" level=info timestamp=2018-07-31T14:51:03.129519Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:03.478545Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.548433Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.554041Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.575687Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.637909Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:51:03.640781Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:03.641678Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.645928Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:04.170148Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 74e95e32-c605-4e5b-ab2e-535e2dc5469f: 173" level=info timestamp=2018-07-31T14:51:05.585956Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmi5wf8j-47t7n Pod phase: Running level=info timestamp=2018-07-31T14:52:24.873896Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:52:25.593002Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:52:25.599178Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:52:25.610491Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 31f24abc-f0b2-4313-bd48-8f2624686dea" level=info timestamp=2018-07-31T14:52:25.611001Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:52:25.897454Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:52:25.957690Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:52:25.959588Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:52:25.969425Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:52:25.969667Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:52:25.973343Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:52:26.008558Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:52:26.011850Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:52:26.039874Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:52:26.618540Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 31f24abc-f0b2-4313-bd48-8f2624686dea: 164" Pod name: virt-launcher-testvmidhbnn-p8tqq Pod phase: Running level=info timestamp=2018-07-31T14:51:25.558976Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:25.570314Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:25.586591Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 19c3ec91-c8dd-432d-8729-e75751fe0c35" level=info timestamp=2018-07-31T14:51:25.587165Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:25.852078Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:25.895108Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:25.909495Z pos=client.go:145 component=virt-launcher msg="processed 
event" level=info timestamp=2018-07-31T14:51:25.931430Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:25.944027Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:51:25.947038Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:25.982096Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:25.990746Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:26.004004Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:26.016876Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:26.595202Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 19c3ec91-c8dd-432d-8729-e75751fe0c35: 164" Pod name: virt-launcher-testvmihhjlk-jrbgz Pod phase: Running level=info timestamp=2018-07-31T14:51:03.942077Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:51:04.890679Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:04.908465Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:04.939768Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 48827b14-3dff-46cf-b658-7979efd5c7a4" level=info timestamp=2018-07-31T14:51:04.940166Z 
pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:05.653439Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:05.731983Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:05.738813Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:05.757273Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:05.797633Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:51:05.812692Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.822585Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:05.842200Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:05.892609Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.989760Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 48827b14-3dff-46cf-b658-7979efd5c7a4: 172" Pod name: virt-launcher-testvmihjws7-mzqqw Pod phase: Running level=info timestamp=2018-07-31T14:51:03.760254Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:51:04.779934Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:04.803402Z pos=client.go:145 
component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:04.814361Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 0f5ed251-287e-45b0-a389-5134e64b7531" level=info timestamp=2018-07-31T14:51:04.814892Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:05.289719Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:05.323360Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:05.327671Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:05.379025Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:05.408457Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:51:05.422509Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.423975Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:05.429970Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:05.601066Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.898541Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 0f5ed251-287e-45b0-a389-5134e64b7531: 175" Pod name: virt-launcher-testvmij4kc4-84hsl Pod phase: Running level=info timestamp=2018-07-31T14:50:54.538199Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.482949Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:50:55.490061Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.533023Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 17af30fd-508b-4af4-aa9a-109c7f74f9d0" level=info timestamp=2018-07-31T14:50:55.535858Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:50:55.731572Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.773478Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:50:55.773862Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:50:55.777880Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:50:55.784117Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.784326Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.806171Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:50:55.808161Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.829940Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:50:56.555125Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 17af30fd-508b-4af4-aa9a-109c7f74f9d0: 159" Pod name: virt-launcher-testvmipn5kt-w8jz9 Pod phase: Running level=info timestamp=2018-07-31T14:51:01.987867Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.087187Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:03.098265Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.106106Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID e9e1e65c-43ce-4142-8bc7-95f3e591582a" level=info timestamp=2018-07-31T14:51:03.106678Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:03.395245Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.485845Z pos=client.go:119 
component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.488430Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:51:03.502773Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:03.503846Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.504024Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.558162Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.567145Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:04.120492Z pos=monitor.go:222 component=virt-launcher msg="Found PID for e9e1e65c-43ce-4142-8bc7-95f3e591582a: 167" level=info timestamp=2018-07-31T14:51:05.563864Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmiqf9rp-mhkxl Pod phase: Running level=info timestamp=2018-07-31T14:55:20.251612Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:55:20.748822Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:55:20.754635Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.062156Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:55:21.170497Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info 
timestamp=2018-07-31T14:55:21.185744Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.186029Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:55:21.201051Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:55:21.256624Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:55:21.266220Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:55:21.270985Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 624fe7a2-3bc8-587f-86b1-87f80cefb018" level=info timestamp=2018-07-31T14:55:21.273256Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:55:21.308126Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.522177Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:55:22.281341Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 624fe7a2-3bc8-587f-86b1-87f80cefb018: 163" Pod name: virt-launcher-testvmirfgpm-822m4 Pod phase: Running level=info timestamp=2018-07-31T14:56:12.029114Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:56:12.065028Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.228501Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 
received" level=info timestamp=2018-07-31T14:56:12.257890Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:56:12.264239Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:56:12.266929Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:56:12.271605Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.271865Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:56:12.299013Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:56:12.308447Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.397347Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network" level=info timestamp=2018-07-31T14:56:12.398102Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n" level=info timestamp=2018-07-31T14:56:12.398156Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local" level=info timestamp=2018-07-31T14:56:12.401463Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:56:13.127670Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 3f3b6e51-0fce-4328-8e8d-6f8e807b9f60: 159" • Failure [103.869 seconds] Slirp 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
  should be able to
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    VirtualMachineInstance with slirp interface [It]
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Expected error:
        : {
            Err: {
                s: "command terminated with exit code 126",
            },
            Code: 126,
        }
        command terminated with exit code 126
    not to have occurred

    /root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:88
------------------------------
level=info timestamp=2018-07-31T14:55:53.547461Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmirfgpm-822m4"
level=info timestamp=2018-07-31T14:56:11.143669Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmirfgpm-822m4"
level=info timestamp=2018-07-31T14:56:12.300607Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T14:56:12.332241Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
level=info timestamp=2018-07-31T14:56:45.029762Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4qj5k-xrx2n" level=info timestamp=2018-07-31T14:57:01.630011Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi4qj5k-xrx2n" level=info timestamp=2018-07-31T14:57:02.714621Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined." level=info timestamp=2018-07-31T14:57:02.822011Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started." 
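For context on the `netcat*` pod traces above: each pod runs a small connectivity probe and compares the first line it reads against the expected greeting. The following is a hedged reconstruction of that probe, not the repository's actual script; the host and port are examples taken from the trace, and the `nc` call is stubbed out with a `printf` so the sketch runs without a cluster (the real pods call `nc "$host" "$port" -i 1 -w 1`).

```shell
#!/bin/sh
# Hedged sketch of the probe each netcat* pod appears to run (TCP case).
probe() {
    host=$1
    port=$2
    # Real pods: x=$(nc "$host" "$port" -i 1 -w 1 | head -n 1)
    # Stubbed here so the sketch is self-contained:
    x=$(printf 'Hello World!\n' | head -n 1)
    echo "$x"
    if [ "$x" = 'Hello World!' ]; then
        echo succeeded
        return 0
    fi
    echo failed
    return 1
}

probe 192.168.66.101 30017
```

The `Succeeded` pod phases show these probes passing; the failing Slirp case below is a different error path ("command terminated with exit code 126" from exec'ing the check command inside the pod), not a probe mismatch.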
STEP: have containerPort in the pod manifest
STEP: start the virtual machine with slirp interface
level=info timestamp=2018-07-31T14:57:33.061384Z pos=vmi_slirp_interface_test.go:87 component=tests msg=
Pod name: disks-images-provider-hppz6 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-jnzcs Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-27kb9 Pod phase: Running
2018/07/31 14:56:33 http: TLS handshake error from 10.244.1.1:46302: EOF
level=info timestamp=2018-07-31T14:56:41.112978Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:56:44 http: TLS handshake error from 10.244.1.1:46308: EOF
level=info timestamp=2018-07-31T14:56:49.179637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T14:56:49.185700Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:56:54 http: TLS handshake error from 10.244.1.1:46314: EOF
level=info timestamp=2018-07-31T14:56:56.996743Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T14:56:56.997747Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/31 14:57:03 http: TLS handshake error from 10.244.1.1:46320: EOF
level=info timestamp=2018-07-31T14:57:11.152850Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:57:13 http: TLS handshake error from 10.244.1.1:46326: EOF
2018/07/31 14:57:24 http: TLS handshake error from 10.244.1.1:46332: EOF
level=info timestamp=2018-07-31T14:57:27.240186Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T14:57:27.242057Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/31 14:57:33 http: TLS handshake error from 10.244.1.1:46340: EOF
Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running
level=info timestamp=2018-07-31T14:55:22.176311Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded"
2018/07/31 14:55:28 http: TLS handshake error from 10.244.0.1:32868: EOF
2018/07/31 14:55:38 http: TLS handshake error from 10.244.0.1:32892: EOF
2018/07/31 14:55:48 http: TLS handshake error from 10.244.0.1:32916: EOF
2018/07/31 14:55:58 http: TLS handshake error from 10.244.0.1:32940: EOF
2018/07/31 14:56:08 http: TLS handshake error from 10.244.0.1:32964: EOF
2018/07/31 14:56:18 http: TLS handshake error from 10.244.0.1:32994: EOF
2018/07/31 14:56:28 http: TLS handshake error from 10.244.0.1:33018: EOF
2018/07/31 14:56:38 http: TLS handshake error from 10.244.0.1:33042: EOF
2018/07/31 14:56:48 http: TLS handshake error from 10.244.0.1:33068: EOF
2018/07/31 14:56:58 http: TLS handshake error from 10.244.0.1:33092: EOF
level=info timestamp=2018-07-31T14:57:03.205670Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded"
2018/07/31 14:57:08 http: TLS handshake error from 10.244.0.1:33124: EOF
2018/07/31 14:57:18 http: TLS handshake error from 10.244.0.1:33148: EOF
2018/07/31 14:57:28 http: TLS handshake error from 10.244.0.1:33172: EOF
Pod name: virt-controller-7d57d96b65-c5mrh Pod phase: Running
level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running
level=info timestamp=2018-07-31T14:55:03.404298Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:03.404388Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:03.482306Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:03.482394Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:19.965319Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:19.978023Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.518982Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.519110Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.572218Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.572317Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:53.409538Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:55:53.410311Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:44.678271Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:56:44.684287Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:45.080222Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4qj5k"
Pod name: virt-handler-gr5ql Pod phase: Running
level=info timestamp=2018-07-31T14:57:03.124625Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="No update processing required"
level=error timestamp=2018-07-31T14:57:03.148879Z pos=vm.go:431 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachineInstance status failed."
level=info timestamp=2018-07-31T14:57:03.149030Z pos=vm.go:251 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4qj5k"
level=info timestamp=2018-07-31T14:57:03.149135Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4qj5k, existing: true\n"
level=info timestamp=2018-07-31T14:57:03.149164Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:57:03.149241Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:57:03.149262Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:57:03.149368Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:57:03.163907Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:57:03.164000Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4qj5k, existing: true\n"
level=info timestamp=2018-07-31T14:57:03.164034Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:57:03.164083Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:57:03.164102Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:57:03.164179Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:57:03.173724Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
Pod name: virt-handler-s6p2b Pod phase: Running
level=info timestamp=2018-07-31T14:56:12.270150Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=Domain uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-31T14:56:12.305581Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-31T14:56:12.320553Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:56:12.339747Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: true\n"
level=info timestamp=2018-07-31T14:56:12.339884Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-31T14:56:12.339955Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:56:12.340016Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:56:12.340184Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-31T14:56:12.392882Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:56:12.394297Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: true\n"
level=info timestamp=2018-07-31T14:56:12.394346Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T14:56:12.394375Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T14:56:12.394393Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T14:56:12.394500Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T14:56:12.401876Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
Pod name: netcat9579t Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.101 30017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcatdkpj4 Pod phase: Succeeded
++ head -n 1
+++ nc 10.96.46.205 27017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcathw855 Pod phase: Succeeded
++ head -n 1
+++ nc 192.168.66.102 30017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcatkg9qx Pod phase: Succeeded
++ head -n 1
+++ nc 10.98.62.24 27017 -i 1 -w 1
Hello World!
succeeded
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Pod name: netcatn4p6m Pod phase: Succeeded
++ head -n 1
+++ echo
+++ nc -ul 29016
+++ nc -up 29016 10.109.149.60 29017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatp45v5 Pod phase: Succeeded
++ head -n 1
+++ nc 10.100.27.166 27017 -i 1 -w 1
+ x='Hello World!'
+ echo 'Hello World!'
+ '[' 'Hello World!' = 'Hello World!' ']'
+ echo succeeded
+ exit 0
Hello World!
succeeded
Pod name: netcatz77rz Pod phase: Succeeded
++ head -n 1
+++ echo
+++ nc -ul 28016
+++ nc -up 28016 10.111.226.105 28017 -i 1 -w 1
+ x='Hello UDP World!'
+ echo 'Hello UDP World!'
+ '[' 'Hello UDP World!' = 'Hello UDP World!' ']'
+ echo succeeded
+ exit 0
Hello UDP World!
succeeded
Pod name: netcatzzbl5 Pod phase: Running
++ head -n 1
+++ nc -up 31016 192.168.66.101 31017 -i 1 -w 1
+++ nc -ul 31016
+++ echo
Pod name: virt-launcher-testvmi2m9wjb6n2b-ktkgm Pod phase: Running
level=info timestamp=2018-07-31T14:54:19.756459Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:54:20.731235Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:54:20.744667Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 1f645137-2f0e-430d-8a4b-a7cedf6d77ad"
level=info timestamp=2018-07-31T14:54:20.744967Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:54:20.745025Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:21.203978Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:54:21.252784Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:54:21.263605Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:21.263873Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:54:21.295110Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:54:21.301530Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:54:21.302960Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:54:21.307740Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:21.336154Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind= uid=9024c2ae-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:54:21.755827Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 1f645137-2f0e-430d-8a4b-a7cedf6d77ad: 164"
Pod name: virt-launcher-testvmi2m9wjld5bq-fj4ck Pod phase: Running
level=info timestamp=2018-07-31T14:54:19.666060Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:54:20.377603Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 38480970-42fa-4cba-a79a-a7777038da83"
level=info timestamp=2018-07-31T14:54:20.378061Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:54:20.406606Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:54:20.465414Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:20.749254Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:54:20.790335Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:54:20.803023Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:20.816511Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:54:20.862624Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:54:20.874819Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:54:20.884004Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:54:20.913008Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:54:21.016804Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi2m9wjld5bq kind= uid=90253aaf-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:54:21.439234Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 38480970-42fa-4cba-a79a-a7777038da83: 164"
Pod name: virt-launcher-testvmi4qj5k-xrx2n Pod phase: Running
level=info timestamp=2018-07-31T14:57:03.037277Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:57:03.051635Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:57:03.064607Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:57:03.070263Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:57:03.119200Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 98897e6f-c7fe-48e3-8e18-a8e87e56a264"
level=info timestamp=2018-07-31T14:57:03.119756Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:57:03.151437Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network"
level=info timestamp=2018-07-31T14:57:03.151883Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n"
level=info timestamp=2018-07-31T14:57:03.151921Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local"
level=info timestamp=2018-07-31T14:57:03.159757Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:57:03.165006Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network"
level=info timestamp=2018-07-31T14:57:03.165378Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n"
level=info timestamp=2018-07-31T14:57:03.165408Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local"
level=info timestamp=2018-07-31T14:57:03.167773Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:57:04.135220Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 98897e6f-c7fe-48e3-8e18-a8e87e56a264: 160"
Pod name: virt-launcher-testvmi5s9j4-dwb26 Pod phase: Running
level=info timestamp=2018-07-31T14:51:01.818046Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.107995Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:03.115100Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.129268Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 74e95e32-c605-4e5b-ab2e-535e2dc5469f"
level=info timestamp=2018-07-31T14:51:03.129519Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:03.478545Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.548433Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.554041Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:03.575687Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:03.637909Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:03.640781Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:03.641678Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:03.645928Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.170148Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 74e95e32-c605-4e5b-ab2e-535e2dc5469f: 173"
level=info timestamp=2018-07-31T14:51:05.585956Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5s9j4 kind= uid=1637c221-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
Pod name: virt-launcher-testvmi5wf8j-47t7n Pod phase: Running
level=info timestamp=2018-07-31T14:52:24.873896Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.593002Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:52:25.599178Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:25.610491Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 31f24abc-f0b2-4313-bd48-8f2624686dea"
level=info timestamp=2018-07-31T14:52:25.611001Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:52:25.897454Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.957690Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:52:25.959588Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:52:25.969425Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:25.969667Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:52:25.973343Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:52:26.008558Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:52:26.011850Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:52:26.039874Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi5wf8j kind= uid=4c3ce8a8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:52:26.618540Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 31f24abc-f0b2-4313-bd48-8f2624686dea: 164"
Pod name: virt-launcher-testvmidhbnn-p8tqq Pod phase: Running
level=info timestamp=2018-07-31T14:51:25.558976Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:25.570314Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:25.586591Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 19c3ec91-c8dd-432d-8729-e75751fe0c35"
level=info timestamp=2018-07-31T14:51:25.587165Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:25.852078Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:25.895108Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:25.909495Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:25.931430Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:25.944027Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:25.947038Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:25.982096Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:25.990746Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:26.004004Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:26.016876Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmidhbnn kind= uid=2764ec71-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:26.595202Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 19c3ec91-c8dd-432d-8729-e75751fe0c35: 164"
Pod name: virt-launcher-testvmihhjlk-jrbgz Pod phase: Running
level=info timestamp=2018-07-31T14:51:03.942077Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:04.890679Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:04.908465Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.939768Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 48827b14-3dff-46cf-b658-7979efd5c7a4"
level=info timestamp=2018-07-31T14:51:04.940166Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:05.653439Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.731983Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.738813Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.757273Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.797633Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:05.812692Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.822585Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.842200Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.892609Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihhjlk kind= uid=163be118-94d1-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:51:05.989760Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 48827b14-3dff-46cf-b658-7979efd5c7a4: 172"
Pod name: virt-launcher-testvmihjws7-mzqqw Pod phase: Running
level=info timestamp=2018-07-31T14:51:03.760254Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:51:04.779934Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:51:04.803402Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:04.814361Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 0f5ed251-287e-45b0-a389-5134e64b7531"
level=info timestamp=2018-07-31T14:51:04.814892Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:51:05.289719Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.323360Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:51:05.327671Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:51:05.379025Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:51:05.408457Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:51:05.422509Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.423975Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:05.429970Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:05.601066Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihjws7 kind= uid=164dbdb6-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:05.898541Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 0f5ed251-287e-45b0-a389-5134e64b7531: 175" Pod name: virt-launcher-testvmij4kc4-84hsl Pod phase: Running level=info timestamp=2018-07-31T14:50:54.538199Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.482949Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:50:55.490061Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.533023Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 17af30fd-508b-4af4-aa9a-109c7f74f9d0" level=info timestamp=2018-07-31T14:50:55.535858Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:50:55.731572Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.773478Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:50:55.773862Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T14:50:55.777880Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:50:55.784117Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.784326Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:50:55.806171Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:50:55.808161Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:50:55.829940Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmij4kc4 kind= uid=16406992-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:50:56.555125Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 17af30fd-508b-4af4-aa9a-109c7f74f9d0: 159" Pod name: virt-launcher-testvmipn5kt-w8jz9 Pod phase: Running level=info timestamp=2018-07-31T14:51:01.987867Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.087187Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:51:03.098265Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.106106Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID e9e1e65c-43ce-4142-8bc7-95f3e591582a" level=info timestamp=2018-07-31T14:51:03.106678Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:51:03.395245Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.485845Z pos=client.go:119 
component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.488430Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:51:03.502773Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:51:03.503846Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:03.504024Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:51:03.558162Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:51:03.567145Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:51:04.120492Z pos=monitor.go:222 component=virt-launcher msg="Found PID for e9e1e65c-43ce-4142-8bc7-95f3e591582a: 167" level=info timestamp=2018-07-31T14:51:05.563864Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipn5kt kind= uid=165aab9d-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" Pod name: virt-launcher-testvmiqf9rp-mhkxl Pod phase: Running level=info timestamp=2018-07-31T14:55:20.251612Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received" level=info timestamp=2018-07-31T14:55:20.748822Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T14:55:20.754635Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.062156Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T14:55:21.170497Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info 
timestamp=2018-07-31T14:55:21.185744Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.186029Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:55:21.201051Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:55:21.256624Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:55:21.266220Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:55:21.270985Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 624fe7a2-3bc8-587f-86b1-87f80cefb018" level=info timestamp=2018-07-31T14:55:21.273256Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:55:21.308126Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:55:21.522177Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b48eab99-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:55:22.281341Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 624fe7a2-3bc8-587f-86b1-87f80cefb018: 163" Pod name: virt-launcher-testvmirfgpm-822m4 Pod phase: Running level=info timestamp=2018-07-31T14:56:12.029114Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T14:56:12.065028Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.228501Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 
received" level=info timestamp=2018-07-31T14:56:12.257890Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Domain started." level=info timestamp=2018-07-31T14:56:12.264239Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:56:12.266929Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:56:12.271605Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.271865Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T14:56:12.299013Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T14:56:12.308447Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T14:56:12.397347Z pos=converter.go:535 component=virt-launcher msg="The network interface type of default was changed to e1000 due to unsupported interface type by qemu slirp network" level=info timestamp=2018-07-31T14:56:12.398102Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n" level=info timestamp=2018-07-31T14:56:12.398156Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local" level=info timestamp=2018-07-31T14:56:12.401463Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T14:56:13.127670Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 3f3b6e51-0fce-4328-8e8d-6f8e807b9f60: 159" • Failure [4.415 seconds] Slirp 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
should be able to
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
VirtualMachineInstance with slirp interface with custom MAC address [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

Expected error:
    : {
        Err: {
            s: "command terminated with exit code 126",
        },
        Code: 126,
    }
    command terminated with exit code 126
not to have occurred
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:88
------------------------------
STEP: have containerPort in the pod manifest
STEP: start the virtual machine with slirp interface
level=info timestamp=2018-07-31T14:57:37.540147Z pos=vmi_slirp_interface_test.go:87 component=tests msg=

Pod name: disks-images-provider-hppz6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jnzcs
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-27kb9
Pod phase: Running

level=error timestamp=2018-07-31T14:58:36.595920Z pos=subresource.go:85 component=virt-api msg=
2018/07/31 14:58:36 http: response.WriteHeader on hijacked connection
level=error timestamp=2018-07-31T14:58:36.597261Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.1.3:8443->10.244.0.0:48028: use of closed network connection" msg="error ecountered reading from websocket stream"
level=info timestamp=2018-07-31T14:58:36.597790Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmiktr5d/console proto=HTTP/1.1 statusCode=200 contentLength=0
level=info timestamp=2018-07-31T14:58:41.021519Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T14:58:41.051116Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T14:58:41.292476Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:58:43 http: TLS handshake error from 10.244.1.1:46384: EOF
level=info timestamp=2018-07-31T14:58:48.240426Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T14:58:48.246129Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 14:58:53 http: TLS handshake error from 10.244.1.1:46390: EOF
level=info timestamp=2018-07-31T14:58:57.705700Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T14:58:57.708830Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/07/31 14:59:03 http: TLS handshake error from 10.244.1.1:46396: EOF
level=info timestamp=2018-07-31T14:59:11.143389Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-api-7d79764579-xrlr5
Pod phase: Running

level=error timestamp=2018-07-31T14:57:42.485431Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.0.21:8443->10.244.0.1:47866: use of closed network connection" msg="error ecountered reading from websocket stream"
level=info timestamp=2018-07-31T14:57:42.485573Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4qj5k/console proto=HTTP/1.1 statusCode=200 contentLength=0
level=error timestamp=2018-07-31T14:57:46.746655Z pos=subresource.go:85 component=virt-api msg="connection failed: command terminated with exit code 137"
2018/07/31 14:57:46 http: response.WriteHeader on hijacked connection
level=error timestamp=2018-07-31T14:57:46.760460Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.0.21:8443->10.244.0.1:47006: use of closed network connection" msg="error ecountered reading from websocket stream"
level=info timestamp=2018-07-31T14:57:46.762937Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmidhbnn/console proto=HTTP/1.1 statusCode=500 contentLength=0
2018/07/31 14:57:48 http: TLS handshake error from 10.244.0.1:33228: EOF
2018/07/31 14:57:58 http: TLS handshake error from 10.244.0.1:33252: EOF
2018/07/31 14:58:08 http: TLS handshake error from 10.244.0.1:33282: EOF
2018/07/31 14:58:18 http: TLS handshake error from 10.244.0.1:33306: EOF
2018/07/31 14:58:28 http: TLS handshake error from 10.244.0.1:33330: EOF
2018/07/31 14:58:38 http: TLS handshake error from 10.244.0.1:33354: EOF
2018/07/31 14:58:48 http: TLS handshake error from 10.244.0.1:33378: EOF
2018/07/31 14:58:58 http: TLS handshake error from 10.244.0.1:33402: EOF
2018/07/31 14:59:08 http: TLS handshake error from 10.244.0.1:33426: EOF

Pod name: virt-controller-7d57d96b65-c5mrh
Pod phase: Running

level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-fmt47
Pod phase: Running

level=info timestamp=2018-07-31T14:55:03.482306Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:03.482394Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:19.965319Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:19.978023Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.518982Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.519110Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:21.572218Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Started processing VM"
level=info timestamp=2018-07-31T14:55:21.572317Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqf9rp kind= uid=b45d2823-94d1-11e8-bcd5-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-07-31T14:55:53.409538Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:55:53.410311Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirfgpm kind= uid=d263e4b8-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:44.678271Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:56:44.684287Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4qj5k kind= uid=f0f21a79-94d1-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T14:56:45.080222Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4qj5k\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4qj5k"
level=info timestamp=2018-07-31T14:57:44.354213Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktr5d kind= uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T14:57:44.360362Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktr5d kind= uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-gr5ql
Pod phase: Running

level=info timestamp=2018-07-31T14:58:07.413782Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:07.413958Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmihjws7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:07.414148Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmihjws7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:07.414511Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi2m9wjb6n2b, existing: false\n"
level=info timestamp=2018-07-31T14:58:07.414567Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:07.414650Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:07.414830Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi2m9wjb6n2b kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:07.416080Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4qj5k, existing: false\n"
level=info timestamp=2018-07-31T14:58:07.416255Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:07.416522Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:07.416710Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi4qj5k kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:07.416934Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi5s9j4, existing: false\n"
level=info timestamp=2018-07-31T14:58:07.417008Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:07.417108Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi5s9j4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:07.417270Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi5s9j4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-handler-s6p2b
Pod phase: Running

level=info timestamp=2018-07-31T14:58:06.969402Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmiqf9rp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:06.969600Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmiqf9rp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:06.971776Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmij4kc4, existing: false\n"
level=info timestamp=2018-07-31T14:58:06.972089Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:06.972273Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:06.972436Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmij4kc4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:14.035134Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-31T14:58:14.037478Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: false\n"
level=info timestamp=2018-07-31T14:58:14.037614Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:14.038108Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:14.039212Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T14:58:14.040466Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmirfgpm, existing: false\n"
level=info timestamp=2018-07-31T14:58:14.040858Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T14:58:14.041117Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T14:58:14.041509Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmirfgpm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmiktr5d-mnfff
Pod phase: Running

level=info timestamp=2018-07-31T14:58:03.099135Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T14:58:03.990537Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T14:58:04.001624Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:58:04.033258Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 118b1dae-ecf0-4c8e-8dc1-25916255c7a6"
level=info timestamp=2018-07-31T14:58:04.033692Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T14:58:04.804250Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T14:58:04.852653Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:58:04.856748Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:58:04.858288Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T14:58:04.887833Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmiktr5d kind= uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T14:58:04.890860Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiktr5d kind= uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:58:04.891916Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T14:58:04.897414Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T14:58:04.939058Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmiktr5d kind= uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T14:58:05.040582Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 118b1dae-ecf0-4c8e-8dc1-25916255c7a6: 165"
2018/07/31 10:59:12 read closing down: EOF

• Failure [90.461 seconds]
Health Monitoring
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37
A VirtualMachineInstance with a watchdog device
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56
should be shut down when the watchdog expires [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57

Timed out after 40.011s.
Expected
    : Running
to equal
    : Failed
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:85
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-31T14:57:45.250178Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiktr5d kind=VirtualMachineInstance uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiktr5d-mnfff"
level=info timestamp=2018-07-31T14:58:02.411526Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiktr5d kind=VirtualMachineInstance uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiktr5d-mnfff"
level=info timestamp=2018-07-31T14:58:04.519460Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiktr5d kind=VirtualMachineInstance uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T14:58:04.551540Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiktr5d kind=VirtualMachineInstance uid=148766eb-94d2-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
STEP: Expecting the VirtualMachineInstance console
STEP: Killing the watchdog device
STEP: Checking that the VirtualMachineInstance has Failed status
• [SLOW TEST:19.326 seconds]
VNC
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:46
A new VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:54
with VNC connection
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:62
should allow accessing the VNC device
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:64
------------------------------
••
------------------------------
• [SLOW TEST:5.183 seconds]
Subresource Api
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:37
Rbac Authorization
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:48
with correct permissions
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:51
should be allowed to access subresource endpoint
/root/go/src/kubevirt.io/kubevirt/tests/subresource_api_test.go:52
------------------------------
•••volumedisk0 compute
------------------------------
• [SLOW TEST:41.612 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
VirtualMachineInstance definition
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
with 3 CPU cores
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:56
should report 3 cpu cores under guest OS
2018/07/31 11:00:32 read closing down: EOF
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:62
------------------------------
•
------------------------------
• [SLOW TEST:18.395 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
VirtualMachineInstance definition
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
with hugepages
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
should consume hugepages
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
hugepages-2Mi
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
S [SKIPPING] [0.230 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
VirtualMachineInstance definition
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
with hugepages
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
should consume hugepages
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
hugepages-1Gi [It]
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

No node with hugepages hugepages-1Gi capacity
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:216
------------------------------
•
------------------------------
• [SLOW TEST:99.389 seconds]
2018/07/31 11:02:33 read closing down: EOF
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
with CPU spec
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
when CPU model defined
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:340
should report defined CPU model
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:341
------------------------------
2018/07/31 11:04:30 read closing down: EOF
• [SLOW TEST:116.551 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
with CPU spec
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
when CPU model equals to passthrough
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:368
should report exactly the same model as node CPU
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:369
------------------------------
2018/07/31 11:06:19 read closing down: EOF
• [SLOW TEST:109.758 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
with CPU spec
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294 when CPU model not defined /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:392 should report CPU model from libvirt capabilities /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:393 ------------------------------ • [SLOW TEST:53.122 seconds] 2018/07/31 11:07:12 read closing down: EOF Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 New VirtualMachineInstance with all supported drives /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:413 should have all the device nodes /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:436 ------------------------------ • ------------------------------ • [SLOW TEST:19.793 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 should start it /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:80 ------------------------------ • [SLOW TEST:19.413 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 should attach virt-launcher to it /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:86 ------------------------------ ••••2018/07/31 11:08:34 read closing down: EOF ------------------------------ • [SLOW TEST:40.727 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174 should be able to boot from selected disk /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Alpine as first boot 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ 2018/07/31 11:09:02 read closing down: EOF • [SLOW TEST:28.252 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174 should be able to boot from selected disk /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Cirros as first boot /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ • [SLOW TEST:16.806 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should retry starting the VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:207 ------------------------------ • [SLOW TEST:17.873 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should log warning and proceed once the secret is there /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:237 ------------------------------ • [SLOW TEST:36.362 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 
when virt-launcher crashes /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:285 should be stopped and have Failed phase /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:286 ------------------------------ Pod name: disks-images-provider-hppz6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jnzcs Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-27kb9 Pod phase: Running 2018/07/31 15:10:34 http: TLS handshake error from 10.244.1.1:46828: EOF level=info timestamp=2018-07-31T15:10:34.805401Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:10:34.964964Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:10:41.330953Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 15:10:43 http: TLS handshake error from 10.244.1.1:46834: EOF 2018/07/31 15:10:53 http: TLS handshake error from 10.244.1.1:46840: EOF 2018/07/31 15:11:04 http: TLS handshake error from 10.244.1.1:46846: EOF level=info timestamp=2018-07-31T15:11:04.974781Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:11:05.093114Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:11:11.137703Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ 
proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 15:11:13 http: TLS handshake error from 10.244.1.1:46852: EOF 2018/07/31 15:11:23 http: TLS handshake error from 10.244.1.1:46858: EOF 2018/07/31 15:11:33 http: TLS handshake error from 10.244.1.1:46864: EOF level=info timestamp=2018-07-31T15:11:35.106770Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:11:35.230129Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running 2018/07/31 15:09:08 http: TLS handshake error from 10.244.0.1:34968: EOF 2018/07/31 15:09:18 http: TLS handshake error from 10.244.0.1:34992: EOF 2018/07/31 15:09:28 http: TLS handshake error from 10.244.0.1:35016: EOF 2018/07/31 15:09:38 http: TLS handshake error from 10.244.0.1:35040: EOF 2018/07/31 15:09:48 http: TLS handshake error from 10.244.0.1:35064: EOF 2018/07/31 15:09:58 http: TLS handshake error from 10.244.0.1:35088: EOF 2018/07/31 15:10:08 http: TLS handshake error from 10.244.0.1:35112: EOF 2018/07/31 15:10:18 http: TLS handshake error from 10.244.0.1:35136: EOF 2018/07/31 15:10:28 http: TLS handshake error from 10.244.0.1:35160: EOF 2018/07/31 15:10:38 http: TLS handshake error from 10.244.0.1:35184: EOF 2018/07/31 15:10:48 http: TLS handshake error from 10.244.0.1:35210: EOF 2018/07/31 15:10:58 http: TLS handshake error from 10.244.0.1:35234: EOF 2018/07/31 15:11:08 http: TLS handshake error from 10.244.0.1:35258: EOF 2018/07/31 15:11:18 http: TLS handshake error from 10.244.0.1:35282: EOF 2018/07/31 15:11:28 http: TLS handshake error from 10.244.0.1:35306: EOF Pod name: virt-controller-7d57d96b65-c5mrh Pod phase: Running level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 
component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running level=info timestamp=2018-07-31T15:08:35.149658Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixmvd9 kind= uid=98705c2b-94d3-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:08:35.151121Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixmvd9 kind= uid=98705c2b-94d3-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:08:35.333056Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixmvd9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixmvd9" level=info timestamp=2018-07-31T15:09:03.499948Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lpmq kind= uid=a950c7c6-94d3-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:09:03.500189Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lpmq kind= uid=a950c7c6-94d3-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:09:03.842064Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7lpmq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7lpmq" level=info timestamp=2018-07-31T15:09:03.867475Z pos=vmi.go:157 component=virt-controller service=http 
reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7lpmq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7lpmq" level=info timestamp=2018-07-31T15:09:20.182373Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibbnfw kind= uid=b3470e94-94d3-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:09:20.184963Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibbnfw kind= uid=b3470e94-94d3-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:09:38.294915Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw77zn kind= uid=be11ddc9-94d3-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:09:38.295543Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw77zn kind= uid=be11ddc9-94d3-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:09:38.410323Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiw77zn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiw77zn" level=info timestamp=2018-07-31T15:10:14.626586Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipk2kn kind= uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:10:14.628508Z pos=preset.go:171 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmipk2kn kind= uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:10:14.829374Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipk2kn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipk2kn" Pod name: virt-handler-gr5ql Pod phase: Running level=error timestamp=2018-07-31T15:10:28.843631Z pos=vm.go:424 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= reason="connection is shut down" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-07-31T15:10:28.843710Z pos=vm.go:251 component=virt-handler reason="connection is shut down" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibbnfw" level=info timestamp=2018-07-31T15:10:34.285004Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-07-31T15:10:34.285510Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmibbnfw, existing: false\n" level=info timestamp=2018-07-31T15:10:34.285569Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:10:34.285766Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T15:10:34.286880Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T15:10:34.287892Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmibbnfw, existing: false\n"
level=info timestamp=2018-07-31T15:10:34.288022Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:10:34.288184Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:10:34.288531Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:10:49.325112Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmibbnfw, existing: false\n"
level=info timestamp=2018-07-31T15:10:49.325978Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:10:49.326475Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:10:49.326837Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmibbnfw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-s6p2b Pod phase: Running
level=info timestamp=2018-07-31T15:10:38.288281Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmi7lpmq_sock"
level=error timestamp=2018-07-31T15:10:38.288551Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmi7lpmq_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-31T15:10:38.288602Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmigzfvb_sock"
level=error timestamp=2018-07-31T15:10:38.288656Z pos=cache.go:124 component=virt-handler reason="dial unix /var/run/kubevirt/sockets/kubevirt-test-default_testvmigzfvb_sock: connect: connection refused" msg="failed to connect to cmd client socket"
level=info timestamp=2018-07-31T15:10:38.290114Z pos=cache.go:121 component=virt-handler msg="List domains from sock /var/run/kubevirt/sockets/kubevirt-test-default_testvmipk2kn_sock"
level=info timestamp=2018-07-31T15:10:38.385228Z pos=vm.go:725 component=virt-handler namespace=kubevirt-test-default name=testvmipk2kn kind=Domain uid= msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-31T15:10:38.385494Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-07-31T15:10:38.457909Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-07-31T15:10:38.492043Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmipk2kn, existing: true\n"
level=info timestamp=2018-07-31T15:10:38.495904Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T15:10:38.495982Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T15:10:38.496006Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T15:10:38.496086Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T15:10:38.577179Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
level=info timestamp=2018-07-31T15:10:38.613258Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmipk2kn-zztlv Pod phase: Running
level=info timestamp=2018-07-31T15:10:31.926985Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T15:10:31.936451Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 577eb4e2-90bb-455d-a5ba-76b98df9f810"
level=info timestamp=2018-07-31T15:10:31.936776Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T15:10:31.942520Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:10:32.565894Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T15:10:32.628855Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:10:32.638751Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:10:32.638882Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T15:10:32.675259Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmipk2kn kind= uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T15:10:32.684088Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipk2kn kind= uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:10:32.699177Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:10:32.730848Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:10:32.814270Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipk2kn kind= uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:10:32.943893Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 577eb4e2-90bb-455d-a5ba-76b98df9f810: 160"
level=info timestamp=2018-07-31T15:10:38.605899Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Synced vmi"

Pod name: vmi-killerdbl8p Pod phase: Succeeded

Pod name: vmi-killergpthh Pod phase: Succeeded

• Failure [83.762 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
when virt-handler crashes
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:309
should recover and continue management [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:310

Expected
: Running
to equal
: Failed

/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:336
------------------------------
level=info timestamp=2018-07-31T15:10:14.966652Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmipk2kn-zztlv"
level=info timestamp=2018-07-31T15:10:30.442616Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmipk2kn-zztlv"
level=info timestamp=2018-07-31T15:10:32.681663Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T15:10:32.736077Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
STEP: Crashing the virt-handler
STEP: Killing the VirtualMachineInstance
level=info timestamp=2018-07-31T15:10:37.272470Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmipk2kn-zztlv"
level=info timestamp=2018-07-31T15:10:37.272645Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmipk2kn-zztlv"
level=info timestamp=2018-07-31T15:10:37.273317Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T15:10:37.273444Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
level=info timestamp=2018-07-31T15:10:38.547733Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmipk2kn kind=VirtualMachineInstance uid=d3baa194-94d3-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
STEP: Checking that VirtualMachineInstance has 'Failed' phase
• [SLOW TEST:33.114 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
when virt-handler is responsive
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:340
should indicate that a node is ready for vmis
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:341
------------------------------
• [SLOW TEST:97.262 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
when virt-handler is not responsive
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:371
the node controller should react
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:410
------------------------------
• [SLOW TEST:18.417 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
with node tainted
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:463
the vmi with tolerations should be scheduled
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:485
------------------------------
•
------------------------------
• [SLOW TEST:66.153 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
with non default namespace
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:535
should log libvirt start and stop lifecycle events of the domain
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
kubevirt-test-default
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
• [SLOW TEST:74.447 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
with non default namespace
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:535
should log libvirt start and stop lifecycle events of the domain
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
kubevirt-test-alternative
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.335 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should enable emulation in virt-launcher [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:604

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.281 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should be reflected in domain XML [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:641

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
S [SKIPPING] in Spec Setup (BeforeEach) [0.145 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
VirtualMachineInstance Emulation Mode
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:592
should request a TUN device but not KVM [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:685

Software emulation is not enabled on this cluster
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:600
------------------------------
••••
------------------------------
• [SLOW TEST:72.558 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Delete a VirtualMachineInstance's Pod
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:837
should result in the VirtualMachineInstance moving to a finalized state
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:838
------------------------------
• [SLOW TEST:51.701 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Delete a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
with an active pod.
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:870
should result in pod being terminated
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:871
------------------------------
2018/07/31 11:19:19 read closing down: EOF

Pod name: disks-images-provider-hppz6 Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jnzcs Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-27kb9 Pod phase: Running
level=info timestamp=2018-07-31T15:18:37.451793Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:18:37.488571Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:18:41.030952Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T15:18:41.059231Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T15:18:41.367667Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:18:44 http: TLS handshake error from 10.244.1.1:47124: EOF
level=info timestamp=2018-07-31T15:18:49.820903Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T15:18:49.827803Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:18:54 http: TLS handshake error from 10.244.1.1:47130: EOF
2018/07/31 15:19:03 http: TLS handshake error from 10.244.1.1:47136: EOF
level=info timestamp=2018-07-31T15:19:07.578631Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:19:07.589379Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:19:11.167131Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:19:13 http: TLS handshake error from 10.244.1.1:47142: EOF
2018/07/31 15:19:23 http: TLS handshake error from 10.244.1.1:47148: EOF

Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running
2018/07/31 15:17:48 http: TLS handshake error from 10.244.0.1:36222: EOF
2018/07/31 15:17:58 http: TLS handshake error from 10.244.0.1:36246: EOF
2018/07/31 15:18:08 http: TLS handshake error from 10.244.0.1:36270: EOF
2018/07/31 15:18:18 http: TLS handshake error from 10.244.0.1:36294: EOF
2018/07/31 15:18:28 http: TLS handshake error from 10.244.0.1:36318: EOF
2018/07/31 15:18:38 http: TLS handshake error from 10.244.0.1:36342: EOF
2018/07/31 15:18:48 http: TLS handshake error from 10.244.0.1:36366: EOF
level=info timestamp=2018-07-31T15:18:53.897749Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded"
2018/07/31 15:18:58 http: TLS handshake error from 10.244.0.1:36398: EOF
2018/07/31 15:19:08 http: TLS handshake error from 10.244.0.1:36422: EOF
2018/07/31 15:19:18 http: TLS handshake error from 10.244.0.1:36446: EOF
level=error timestamp=2018-07-31T15:19:20.643465Z pos=subresource.go:85 component=virt-api msg=
2018/07/31 15:19:20 http: response.WriteHeader on hijacked connection
level=error timestamp=2018-07-31T15:19:20.646139Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.0.21:8443->10.244.0.1:51140: use of closed network connection" msg="error ecountered reading from websocket stream"
level=info timestamp=2018-07-31T15:19:20.647036Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4ffj7/console proto=HTTP/1.1 statusCode=200 contentLength=0

Pod name: virt-controller-7d57d96b65-c5mrh Pod phase: Running
level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running
level=info timestamp=2018-07-31T15:15:15.126970Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8dnbd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-alternative/testvmi8dnbd"
level=info timestamp=2018-07-31T15:16:29.926432Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9gzrx kind= uid=b36ccb75-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:16:29.928041Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9gzrx kind= uid=b36ccb75-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:16:30.803243Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivh6bq kind= uid=b3f2b4bc-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:16:30.803847Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivh6bq kind= uid=b3f2b4bc-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:16:30.916234Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivh6bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivh6bq"
level=info timestamp=2018-07-31T15:16:30.978582Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivh6bq\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmivh6bq, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: b3f2b4bc-94d4-11e8-bcd5-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivh6bq"
level=info timestamp=2018-07-31T15:16:31.537353Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:16:31.537755Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:17:44.209613Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:17:44.213371Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs
kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:17:44.840496Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilw8gs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilw8gs" level=info timestamp=2018-07-31T15:18:36.063541Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:18:36.065304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:18:36.267798Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4ffj7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ffj7" Pod name: virt-handler-qrr9q Pod phase: Running level=info timestamp=2018-07-31T15:19:00.088413Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmilw8gs, existing: false\n" level=info timestamp=2018-07-31T15:19:00.088492Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:19:00.088592Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmilw8gs kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-31T15:19:00.088707Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmilw8gs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:19:12.187027Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmilw8gs, existing: false\n" level=info timestamp=2018-07-31T15:19:12.187901Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:19:12.188471Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmilw8gs kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T15:19:12.189230Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmilw8gs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:19:20.399548Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi4ffj7, existing: true\n" level=info timestamp=2018-07-31T15:19:20.399835Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-31T15:19:20.399961Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T15:19:20.400045Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T15:19:20.400426Z pos=vm.go:370 component=virt-handler namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-31T15:19:20.400630Z pos=vm.go:407 component=virt-handler namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Processing shutdown." 
level=info timestamp=2018-07-31T15:19:20.405810Z pos=vm.go:556 component=virt-handler namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Grace period expired, killing deleted VirtualMachineInstance testvmi4ffj7" Pod name: virt-handler-vfnhl Pod phase: Running level=info timestamp=2018-07-31T15:17:43.393602Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:17:43.394001Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="No update processing required" level=info timestamp=2018-07-31T15:17:43.394190Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:17:44.050163Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n" level=info timestamp=2018-07-31T15:17:44.050327Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:17:44.050478Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T15:17:44.052419Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-07-31T15:17:44.052606Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n" level=info timestamp=2018-07-31T15:17:44.052656Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:17:44.053660Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T15:17:44.055404Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:18:00.407630Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n" level=info timestamp=2018-07-31T15:18:00.407930Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:18:00.408128Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-07-31T15:18:00.408417Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmi4ffj7-zc5nz Pod phase: Running level=info timestamp=2018-07-31T15:18:53.225938Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11" level=info timestamp=2018-07-31T15:18:53.227754Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 0d66b2e6-2035-41b0-bd09-c76b1e55845e" level=info timestamp=2018-07-31T15:18:53.228062Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s" level=info timestamp=2018-07-31T15:18:53.233568Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T15:18:53.801941Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received" level=info timestamp=2018-07-31T15:18:53.853861Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T15:18:53.859596Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Domain started." 
level=info timestamp=2018-07-31T15:18:53.865874Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T15:18:53.865930Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T15:18:53.866197Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received" level=info timestamp=2018-07-31T15:18:53.902484Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1" level=info timestamp=2018-07-31T15:18:53.928372Z pos=client.go:145 component=virt-launcher msg="processed event" level=info timestamp=2018-07-31T15:18:53.972956Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Synced vmi" level=info timestamp=2018-07-31T15:18:54.236778Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 0d66b2e6-2035-41b0-bd09-c76b1e55845e: 163" level=info timestamp=2018-07-31T15:19:20.464557Z pos=monitor.go:266 component=virt-launcher msg="Received signal 15." • Failure [49.891 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Delete a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869 with ACPI and 0 grace period seconds /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:895 should result in vmi status failed [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:896 Timed out after 5.000s. 
Expected : Running to equal : Failed /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:917 ------------------------------ STEP: Creating the VirtualMachineInstance level=info timestamp=2018-07-31T15:18:36.201441Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ffj7 kind=VirtualMachineInstance uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4ffj7-zc5nz" level=info timestamp=2018-07-31T15:18:52.175890Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ffj7 kind=VirtualMachineInstance uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi4ffj7-zc5nz" level=info timestamp=2018-07-31T15:18:53.558594Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ffj7 kind=VirtualMachineInstance uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined." level=info timestamp=2018-07-31T15:18:53.598058Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ffj7 kind=VirtualMachineInstance uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started." 
STEP: Deleting the VirtualMachineInstance STEP: Verifying VirtualMachineInstance's status is Failed 2018/07/31 11:20:15 read closing down: EOF Pod name: disks-images-provider-hppz6 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-jnzcs Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-27kb9 Pod phase: Running 2018/07/31 15:19:23 http: TLS handshake error from 10.244.1.1:47148: EOF 2018/07/31 15:19:34 http: TLS handshake error from 10.244.1.1:47154: EOF level=info timestamp=2018-07-31T15:19:37.755612Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:19:37.765438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:19:41.284576Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 15:19:43 http: TLS handshake error from 10.244.1.1:47160: EOF level=info timestamp=2018-07-31T15:19:49.836759Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-07-31T15:19:49.846017Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 15:19:53 http: TLS handshake error from 10.244.1.1:47166: EOF 2018/07/31 15:20:04 http: TLS handshake error from 10.244.1.1:47172: EOF level=info timestamp=2018-07-31T15:20:07.942644Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 
contentLength=136 level=info timestamp=2018-07-31T15:20:07.947541Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-07-31T15:20:11.479936Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/07/31 15:20:13 http: TLS handshake error from 10.244.1.1:47178: EOF 2018/07/31 15:20:23 http: TLS handshake error from 10.244.1.1:47184: EOF Pod name: virt-api-7d79764579-xrlr5 Pod phase: Running level=error timestamp=2018-07-31T15:19:20.646139Z pos=subresource.go:97 component=virt-api reason="read tcp 10.244.0.21:8443->10.244.0.1:51140: use of closed network connection" msg="error ecountered reading from websocket stream" level=info timestamp=2018-07-31T15:19:20.647036Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4ffj7/console proto=HTTP/1.1 statusCode=200 contentLength=0 2018/07/31 15:19:28 http: TLS handshake error from 10.244.0.1:36470: EOF 2018/07/31 15:19:38 http: TLS handshake error from 10.244.0.1:36494: EOF level=info timestamp=2018-07-31T15:19:45.582589Z pos=subresource.go:75 component=virt-api msg="Websocket connection upgraded" 2018/07/31 15:19:48 http: TLS handshake error from 10.244.0.1:36526: EOF 2018/07/31 15:19:58 http: TLS handshake error from 10.244.0.1:36550: EOF 2018/07/31 15:20:08 http: TLS handshake error from 10.244.0.1:36574: EOF level=error timestamp=2018-07-31T15:20:16.026979Z pos=subresource.go:97 component=virt-api reason="websocket: close 1006 (abnormal closure): unexpected EOF" msg="error ecountered reading from websocket stream" level=error timestamp=2018-07-31T15:20:16.035139Z pos=subresource.go:106 component=virt-api reason="websocket: close 1006 (abnormal 
closure): unexpected EOF" msg="Error in websocket proxy" 2018/07/31 15:20:16 http: response.WriteHeader on hijacked connection level=info timestamp=2018-07-31T15:20:16.036459Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizm92q/console proto=HTTP/1.1 statusCode=500 contentLength=0 level=error timestamp=2018-07-31T15:20:16.061422Z pos=subresource.go:91 component=virt-api reason="tls: use of closed connection" msg="error ecountered reading from remote podExec stream" 2018/07/31 15:20:18 http: TLS handshake error from 10.244.0.1:36598: EOF 2018/07/31 15:20:28 http: TLS handshake error from 10.244.0.1:36622: EOF Pod name: virt-controller-7d57d96b65-c5mrh Pod phase: Running level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-fmt47 Pod phase: Running level=info timestamp=2018-07-31T15:16:30.803243Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivh6bq kind= uid=b3f2b4bc-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:16:30.803847Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivh6bq kind= uid=b3f2b4bc-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:16:30.916234Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivh6bq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivh6bq" level=info timestamp=2018-07-31T15:16:30.978582Z pos=vmi.go:157 component=virt-controller 
service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivh6bq\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmivh6bq, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: b3f2b4bc-94d4-11e8-bcd5-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivh6bq" level=info timestamp=2018-07-31T15:16:31.537353Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:16:31.537755Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:17:44.209613Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:17:44.213371Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:17:44.840496Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilw8gs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilw8gs" level=info timestamp=2018-07-31T15:18:36.063541Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= 
uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:18:36.065304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:18:36.267798Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4ffj7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ffj7" level=info timestamp=2018-07-31T15:19:25.929090Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-07-31T15:19:25.931523Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-07-31T15:19:26.410867Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizm92q\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizm92q" Pod name: virt-handler-qrr9q Pod phase: Running level=info timestamp=2018-07-31T15:20:16.474332Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T15:20:16.474361Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T15:20:16.474438Z pos=vm.go:344 component=virt-handler namespace=kubevirt-test-default 
name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Shutting down due to graceful shutdown signal." level=info timestamp=2018-07-31T15:20:16.474478Z pos=vm.go:370 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-31T15:20:16.474502Z pos=vm.go:407 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-31T15:20:16.490274Z pos=vm.go:547 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Signaled graceful shutdown for testvmizm92q" level=info timestamp=2018-07-31T15:20:16.490487Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:20:26.491446Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmizm92q, existing: true\n" level=info timestamp=2018-07-31T15:20:26.491961Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n" level=info timestamp=2018-07-31T15:20:26.492076Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n" level=info timestamp=2018-07-31T15:20:26.492179Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n" level=info timestamp=2018-07-31T15:20:26.492621Z pos=vm.go:344 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Shutting down due to graceful shutdown signal." 
level=info timestamp=2018-07-31T15:20:26.492744Z pos=vm.go:370 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Shutting down domain for VirtualMachineInstance with deletion timestamp." level=info timestamp=2018-07-31T15:20:26.492802Z pos=vm.go:407 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Processing shutdown." level=info timestamp=2018-07-31T15:20:26.495747Z pos=vm.go:556 component=virt-handler namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Grace period expired, killing deleted VirtualMachineInstance testvmizm92q" Pod name: virt-handler-vfnhl Pod phase: Running level=info timestamp=2018-07-31T15:17:43.393602Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:17:43.394001Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="No update processing required" level=info timestamp=2018-07-31T15:17:43.394190Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind= uid=b4639da4-94d4-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-07-31T15:17:44.050163Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n" level=info timestamp=2018-07-31T15:17:44.050327Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-07-31T15:17:44.050478Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-07-31T15:17:44.052419Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:17:44.052606Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n"
level=info timestamp=2018-07-31T15:17:44.052656Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:17:44.053660Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:17:44.055404Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:18:00.407630Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmix5nnv, existing: false\n"
level=info timestamp=2018-07-31T15:18:00.407930Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:18:00.408128Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:18:00.408417Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmix5nnv kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmizm92q-ps4w9
Pod phase: Running
level=info timestamp=2018-07-31T15:19:45.601410Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T15:19:45.613400Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:19:45.618990Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:19:45.626055Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:19:45.668939Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:19:45.712082Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 1efd8a96-a222-462b-80c3-c3a6ab7ba30d: 162"
level=info timestamp=2018-07-31T15:20:16.228962Z pos=monitor.go:266 component=virt-launcher msg="Received signal 15."
level=info timestamp=2018-07-31T15:20:16.235640Z pos=manager.go:255 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Signaled graceful shutdown for testvmizm92q"
level=info timestamp=2018-07-31T15:20:16.437035Z pos=server.go:118 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Signaled vmi shutdown"
level=info timestamp=2018-07-31T15:20:16.440766Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 1 received"
level=info timestamp=2018-07-31T15:20:16.468741Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:20:16.472611Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:20:16.473593Z pos=server.go:118 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Signaled vmi shutdown"
level=info timestamp=2018-07-31T15:20:16.489955Z pos=server.go:118 component=virt-launcher namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Signaled vmi shutdown"
level=info timestamp=2018-07-31T15:20:26.501365Z pos=client.go:136 component=virt-launcher msg="Libvirt event 6 with reason 1 received"

• Failure [65.729 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Delete a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
    with ACPI and some grace period seconds
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:920
      should result in vmi status succeeded [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:921

      Timed out after 15.000s.
      Expected
          : Running
      to equal
          : Succeeded

      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:942
------------------------------
STEP: Creating the VirtualMachineInstance
level=info timestamp=2018-07-31T15:19:26.304667Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizm92q kind=VirtualMachineInstance uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmizm92q-ps4w9"
level=info timestamp=2018-07-31T15:19:42.968708Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizm92q kind=VirtualMachineInstance uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmizm92q-ps4w9"
level=info timestamp=2018-07-31T15:19:45.233240Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizm92q kind=VirtualMachineInstance uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T15:19:45.260388Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizm92q kind=VirtualMachineInstance uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
STEP: Deleting the VirtualMachineInstance
STEP: Verifying VirtualMachineInstance's status is Succeeded

• [SLOW TEST:54.241 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Delete a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:869
    with grace period greater than 0
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:945
      should run graceful shutdown
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:946
------------------------------

Pod name: disks-images-provider-hppz6
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-jnzcs
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-27kb9
Pod phase: Running
2018/07/31 15:21:43 http: TLS handshake error from 10.244.1.1:47232: EOF
level=info timestamp=2018-07-31T15:21:49.861689Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-07-31T15:21:49.870724Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:21:54 http: TLS handshake error from 10.244.1.1:47238: EOF
2018/07/31 15:22:04 http: TLS handshake error from 10.244.1.1:47244: EOF
level=info timestamp=2018-07-31T15:22:08.304509Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:22:08.590945Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:22:11.264482Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:22:13 http: TLS handshake error from 10.244.1.1:47250: EOF
2018/07/31 15:22:23 http: TLS handshake error from 10.244.1.1:47256: EOF
2018/07/31 15:22:33 http: TLS handshake error from 10.244.1.1:47262: EOF
level=info timestamp=2018-07-31T15:22:38.464028Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:22:38.712874Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-07-31T15:22:41.170144Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/07/31 15:22:43 http: TLS handshake error from 10.244.1.1:47268: EOF

Pod name: virt-api-7d79764579-xrlr5
Pod phase: Running
2018/07/31 15:20:28 http: TLS handshake error from 10.244.0.1:36622: EOF
2018/07/31 15:20:38 http: TLS handshake error from 10.244.0.1:36646: EOF
2018/07/31 15:20:48 http: TLS handshake error from 10.244.0.1:36670: EOF
2018/07/31 15:20:58 http: TLS handshake error from 10.244.0.1:36694: EOF
2018/07/31 15:21:08 http: TLS handshake error from 10.244.0.1:36718: EOF
2018/07/31 15:21:18 http: TLS handshake error from 10.244.0.1:36742: EOF
2018/07/31 15:21:28 http: TLS handshake error from 10.244.0.1:36766: EOF
2018/07/31 15:21:38 http: TLS handshake error from 10.244.0.1:36790: EOF
2018/07/31 15:21:48 http: TLS handshake error from 10.244.0.1:36814: EOF
2018/07/31 15:21:58 http: TLS handshake error from 10.244.0.1:36838: EOF
2018/07/31 15:22:08 http: TLS handshake error from 10.244.0.1:36862: EOF
2018/07/31 15:22:18 http: TLS handshake error from 10.244.0.1:36886: EOF
2018/07/31 15:22:28 http: TLS handshake error from 10.244.0.1:36910: EOF
2018/07/31 15:22:38 http: TLS handshake error from 10.244.0.1:36934: EOF
2018/07/31 15:22:48 http: TLS handshake error from 10.244.0.1:36958: EOF

Pod name: virt-controller-7d57d96b65-c5mrh
Pod phase: Running
level=info timestamp=2018-07-31T14:47:29.666482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-fmt47
Pod phase: Running
level=info timestamp=2018-07-31T15:17:44.209613Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:17:44.213371Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilw8gs kind= uid=dfb2764e-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:17:44.840496Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilw8gs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilw8gs"
level=info timestamp=2018-07-31T15:18:36.063541Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:18:36.065304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ffj7 kind= uid=fe95afb9-94d4-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:18:36.267798Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4ffj7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ffj7"
level=info timestamp=2018-07-31T15:19:25.929090Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:19:25.931523Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizm92q kind= uid=1c52375a-94d5-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:19:26.410867Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizm92q\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizm92q"
level=info timestamp=2018-07-31T15:20:31.632162Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6vvbd kind= uid=437cda71-94d5-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:20:31.632427Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6vvbd kind= uid=437cda71-94d5-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:20:31.874919Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6vvbd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6vvbd"
level=info timestamp=2018-07-31T15:21:25.822642Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-07-31T15:21:25.829484Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-07-31T15:21:26.093346Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihfp49\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihfp49"

Pod name: virt-handler-qrr9q
Pod phase: Running
level=info timestamp=2018-07-31T15:21:43.236134Z pos=vm.go:756 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind=Domain uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Domain is in state Running reason Unknown"
level=info timestamp=2018-07-31T15:21:43.268371Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:21:43.268486Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmihfp49, existing: true\n"
level=info timestamp=2018-07-31T15:21:43.268530Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-07-31T15:21:43.268562Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T15:21:43.268584Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T15:21:43.268688Z pos=vm.go:419 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="No update processing required"
level=info timestamp=2018-07-31T15:21:43.286429Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type MODIFIED"
level=info timestamp=2018-07-31T15:21:43.332772Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:21:43.332878Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmihfp49, existing: true\n"
level=info timestamp=2018-07-31T15:21:43.332905Z pos=vm.go:315 component=virt-handler msg="vmi is in phase: Running\n"
level=info timestamp=2018-07-31T15:21:43.332936Z pos=vm.go:329 component=virt-handler msg="Domain: existing: true\n"
level=info timestamp=2018-07-31T15:21:43.332956Z pos=vm.go:331 component=virt-handler msg="Domain status: Running, reason: Unknown\n"
level=info timestamp=2018-07-31T15:21:43.333056Z pos=vm.go:416 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Processing vmi update"
level=info timestamp=2018-07-31T15:21:43.337051Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Synchronization loop succeeded."

Pod name: virt-handler-vfnhl
Pod phase: Running
level=error timestamp=2018-07-31T15:21:33.546791Z pos=vm.go:424 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= reason="connection is shut down" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-07-31T15:21:33.546887Z pos=vm.go:251 component=virt-handler reason="connection is shut down" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6vvbd"
level=info timestamp=2018-07-31T15:21:43.321475Z pos=vm.go:746 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-07-31T15:21:43.323668Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6vvbd, existing: false\n"
level=info timestamp=2018-07-31T15:21:43.323891Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:21:43.324191Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:21:43.326117Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:21:43.327124Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6vvbd, existing: false\n"
level=info timestamp=2018-07-31T15:21:43.327415Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:21:43.327608Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:21:43.328353Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-07-31T15:21:43.787548Z pos=vm.go:313 component=virt-handler msg="Processing vmi testvmi6vvbd, existing: false\n"
level=info timestamp=2018-07-31T15:21:43.787815Z pos=vm.go:329 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-07-31T15:21:43.788648Z pos=vm.go:413 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-07-31T15:21:43.790134Z pos=vm.go:440 component=virt-handler namespace=kubevirt-test-default name=testvmi6vvbd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmihfp49-2ztmm
Pod phase: Running
level=info timestamp=2018-07-31T15:21:42.286076Z pos=client.go:136 component=virt-launcher msg="Libvirt event 0 with reason 0 received"
level=info timestamp=2018-07-31T15:21:42.955997Z pos=client.go:119 component=virt-launcher msg="domain status: 3:11"
level=info timestamp=2018-07-31T15:21:42.975141Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:21:43.214379Z pos=client.go:136 component=virt-launcher msg="Libvirt event 4 with reason 0 received"
level=info timestamp=2018-07-31T15:21:43.232592Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:21:43.236688Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:21:43.256505Z pos=client.go:136 component=virt-launcher msg="Libvirt event 2 with reason 0 received"
level=info timestamp=2018-07-31T15:21:43.261293Z pos=manager.go:189 component=virt-launcher namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Domain started."
level=info timestamp=2018-07-31T15:21:43.263611Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:21:43.283983Z pos=virt-launcher.go:217 component=virt-launcher msg="Detected domain with UUID 6ce7319d-ddd9-40a8-9464-1dccc24d9b89"
level=info timestamp=2018-07-31T15:21:43.284670Z pos=monitor.go:253 component=virt-launcher msg="Monitoring loop: rate 1s start timeout 5m0s"
level=info timestamp=2018-07-31T15:21:43.284919Z pos=client.go:119 component=virt-launcher msg="domain status: 1:1"
level=info timestamp=2018-07-31T15:21:43.287136Z pos=client.go:145 component=virt-launcher msg="processed event"
level=info timestamp=2018-07-31T15:21:43.336486Z pos=server.go:74 component=virt-launcher namespace=kubevirt-test-default name=testvmihfp49 kind= uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Synced vmi"
level=info timestamp=2018-07-31T15:21:44.301364Z pos=monitor.go:222 component=virt-launcher msg="Found PID for 6ce7319d-ddd9-40a8-9464-1dccc24d9b89: 160"

Pod name: vmi-killerxrtv8
Pod phase: Succeeded

• Failure [88.079 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:997
    should be in Failed phase [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:998

    Expected
        : Running
    to equal
        : Failed

    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:1021
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-07-31T15:21:26.008865Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmihfp49-2ztmm"
level=info timestamp=2018-07-31T15:21:41.615778Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmihfp49-2ztmm"
level=info timestamp=2018-07-31T15:21:42.911002Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T15:21:42.925420Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
STEP: Killing the VirtualMachineInstance
level=info timestamp=2018-07-31T15:21:53.265240Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmihfp49-2ztmm"
level=info timestamp=2018-07-31T15:21:53.265380Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmihfp49-2ztmm"
level=info timestamp=2018-07-31T15:21:53.266079Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance defined."
level=info timestamp=2018-07-31T15:21:53.266212Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmihfp49 kind=VirtualMachineInstance uid=63ca6f63-94d5-11e8-bcd5-525500d15501 msg="VirtualMachineInstance started."
STEP: Checking that the VirtualMachineInstance has 'Failed' phase

• [SLOW TEST:83.556 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Killed VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:997
    should be left alone by virt-handler
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:1025
------------------------------
2018/07/31 11:25:26 read closing down: EOF
2018/07/31 11:25:36 read closing down: EOF
2018/07/31 11:25:47 read closing down: EOF
2018/07/31 11:25:58 read closing down: EOF
2018/07/31 11:25:59 read closing down: EOF
2018/07/31 11:26:01 read closing down: EOF
2018/07/31 11:26:02 read closing down: EOF

• [SLOW TEST:105.285 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
2018/07/31 11:26:02 read closing down: EOF
2018/07/31 11:26:04 read closing down: EOF
2018/07/31 11:26:05 read closing down: EOF
•2018/07/31 11:26:05 read closing down: EOF
2018/07/31 11:26:08 read closing down: EOF
2018/07/31 11:26:09 read closing down: EOF
•2018/07/31 11:26:09 read closing down: EOF
2018/07/31 11:26:11 read closing down: EOF
2018/07/31 11:26:12 read closing down: EOF
•2018/07/31 11:26:12 read closing down: EOF
------------------------------

• [SLOW TEST:5.192 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on the same node from Pod
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
•••
------------------------------
• [SLOW TEST:5.457 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a service matching the vmi exposed
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:272
    should be able to reach the vmi based on labels specified on the vmi
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:292
------------------------------
•
------------------------------

• [SLOW TEST:5.281 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a subdomain and a headless service given
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:319
    should be able to reach the vmi via its unique fully qualified domain name
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:342
------------------------------
2018/07/31 11:27:19 read closing down: EOF

• [SLOW TEST:38.260 seconds]
2018/07/31 11:27:20 read closing down: EOF
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom interface model
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:368
    should expose the right device type to the guest
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:369
------------------------------
2018/07/31 11:27:20 read closing down: EOF
•2018/07/31 11:27:21 read closing down: EOF
2018/07/31 11:27:55 read closing down: EOF
2018/07/31 11:27:56 read closing down: EOF
------------------------------

• [SLOW TEST:34.717 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:402
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:403
------------------------------
2018/07/31 11:28:30 read closing down: EOF
2018/07/31 11:28:31 read closing down: EOF

• [SLOW TEST:34.908 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address in non-conventional format
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:414
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:415
------------------------------
2018/07/31 11:29:09 read closing down: EOF
2018/07/31 11:29:10 read closing down: EOF

• [SLOW TEST:38.631 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom MAC address and slirp interface
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:427
    should configure custom MAC address
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:428
------------------------------
2018/07/31 11:29:51 read closing down: EOF

• [SLOW TEST:41.939 seconds]
2018/07/31 11:29:51 read closing down: EOF
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with disabled automatic attachment of interfaces
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:440
    should not configure any external interfaces
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:441
------------------------------
•••••••••••
panic: test timed out after 1h30m0s

goroutine 8021 [running]:
testing.(*M).startAlarm.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc
created by time.goFunc
	/gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44

goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(0xc4204ad0e0, 0x139e775, 0x9, 0x1430cd0, 0x4801e6)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301
testing.runTests.func1(0xc4204acff0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64
testing.tRunner(0xc4204acff0, 0xc4207e7df8)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
testing.runTests(0xc420406940, 0x1d32a50, 0x1, 0x1, 0x412009)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4
testing.(*M).Run(0xc4205b5980, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171
main.main()
	_testmain.go:44 +0x151

goroutine 5 [chan receive]:
kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1d5e280)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b
created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203

goroutine 6 [syscall, 90 minutes]:
os/signal.signal_recv(0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6
os/signal.loop()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.0
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41

goroutine 20 [select]:
kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).ExpectSwitchCase(0xc4204ac1e0, 0xc420fc8f40, 0x1, 0x1, 0x1bf08eb000, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:728 +0xc7d
kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).Expect(0xc4204ac1e0, 0xc4210cb360, 0x1bf08eb000, 0x0, 0x0, 0x1432ff0, 0xc4204bbf00, 0x3e, 0xc420fc8fb8, 0xc420fc8ff0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:1123 +0xdb
kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).ExpectBatch(0xc4204ac1e0, 0xc420aec3e0, 0x1, 0x1, 0x1bf08eb000, 0x0, 0xc420a51601, 0x14d35a0, 0xc4204ac1e0, 0xc420f3b560)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:565 +0x4f7
kubevirt.io/kubevirt/tests_test.glob..func2.2(0xc420addb80, 0x13adfdc, 0x16)
	/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:59 +0x41c
kubevirt.io/kubevirt/tests_test.glob..func2.3.1.1.1()
	/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:70 +0x98
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc42045d200, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc42045d200, 0x3, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*ItNode).Run(0xc420459780, 0x14b6ce0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/it_node.go:26 +0x7f
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc420541860, 0x0, 0x14b6ce0, 0xc4200c34c0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:203 +0x648
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc420541860, 0x14b6ce0, 0xc4200c34c0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc4207b9040, 0xc420541860, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc4207b9040, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc4207b9040, 0xb)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200eaaf0, 0x7fd580e1f1c8, 0xc4204ad0e0, 0x13a0d58, 0xb, 0xc420406a40, 0x2, 0x2, 0x14d35e0, 0xc4200c34c0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x14b7d40, 0xc4204ad0e0, 0x13a0d58, 0xb, 0xc4204069c0, 0x2, 0x2, 0x2)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x14b7d40, 0xc4204ad0e0, 0x13a0d58, 0xb, 0xc420550ee0, 0x1, 0x1, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab
kubevirt.io/kubevirt/tests_test.TestTests(0xc4204ad0e0)
	/root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa
testing.tRunner(0xc4204ad0e0, 0x1430cd0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
created by testing.(*T).Run
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0

goroutine 21 [chan receive, 90 minutes]:
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc4207b9040, 0xc4200ddc80)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1
created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88

goroutine 22 [select, 90 minutes, locked to thread]:
runtime.gopark(0x1432ea8, 0x0, 0x139b297, 0x6, 0x18, 0x1)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a
runtime.selectgo(0xc42047d750, 0xc4200ddd40)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50
runtime.ensureSigM.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4
runtime.goexit()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1

goroutine 1524 [IO wait]:
internal/poll.runtime_pollWait(0x7fd580f03f00, 0x72, 0xc420c4f850)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc4207ead98, 0x72, 0xffffffffffffff00, 0x14b8f00, 0x1c497d0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b
internal/poll.(*pollDesc).waitRead(0xc4207ead98, 0xc420ff8000, 0x8000, 0x8000)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc4207ead80, 0xc420ff8000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d
net.(*netFD).Read(0xc4207ead80, 0xc420ff8000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc420f6a060, 0xc420ff8000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a
crypto/tls.(*block).readFromUntil(0xc4206a74d0, 0x7fd580eb0a20, 0xc420f6a060, 0x5, 0xc420f6a060, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96
crypto/tls.(*Conn).readRecord(0xc420951880, 0x1432f17, 0xc4209519a0, 0x20)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0
crypto/tls.(*Conn).Read(0xc420951880, 0xc420811000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100
bufio.(*Reader).Read(0xc420a83d40, 0xc420cfe3b8, 0x9, 0x9, 0xc4217d5618, 0xc420a5dc80, 0xc420c4fd10)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238
io.ReadAtLeast(0x14b5ae0, 0xc420a83d40, 0xc420cfe3b8, 0x9, 0x9, 0x9, 0xc420c4fce0, 0xc420c4fce0, 0x406614)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86
io.ReadFull(0x14b5ae0, 0xc420a83d40, 0xc420cfe3b8, 0x9, 0x9, 0xc4217d55c0, 0xc420c4fd10, 0xc400002701)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc420cfe3b8, 0x9, 0x9, 0x14b5ae0, 0xc420a83d40, 0x0, 0xc400000000, 0x7ef9ad, 0xc420c4ffb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc420cfe380, 0xc420931cb0, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc420c4ffb0, 0x1431c28, 0xc42069afb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc4200f2000)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76
created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651

goroutine 1030 [chan send, 78 minutes]:
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4203cb9b0)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114
created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8

goroutine 1764 [chan send, 67 minutes]:
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42082b440)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114
created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8

goroutine 1178 [chan send, 77 minutes]:
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42076d950)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114
created by
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 766 [chan send, 81 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420952720) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 7995 [chan receive]: kubevirt.io/kubevirt/tests.NewConsoleExpecter.func2(0xc400000010, 0xc4205cea40) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1191 +0x3c kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).waitForSession(0xc4204ac1e0, 0xc420f3b560, 0xc420aec370, 0x14ba7c0, 0xc4204eea68, 0x14b5ec0, 0xc4204eea78, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:1044 +0x16e created by kubevirt.io/kubevirt/vendor/github.com/google/goexpect.SpawnGeneric /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:805 +0x299 goroutine 8020 [chan receive]: io.(*pipe).Write(0xc420b0eff0, 0xc420cac000, 0x1, 0x8000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:89 +0x215 io.(*PipeWriter).Write(0xc4204eea98, 0xc420cac000, 0x1, 0x8000, 0x1, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:153 +0x4c io.copyBuffer(0x14b5ee0, 0xc4204eea98, 0x14b5f80, 0xc4204eec10, 0xc420cac000, 0x8000, 0x8000, 0xc4200ddf80, 0xe14403, 0x1) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:402 +0x240 io.Copy(0x14b5ee0, 0xc4204eea98, 0x14b5f80, 0xc4204eec10, 0x0, 0xc420dad7c0, 0x7e5620) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:362 +0x5a kubevirt.io/kubevirt/pkg/kubecli.(*wsStreamer).Stream.func2(0x14b5ec0, 0xc4204eea58, 0x14b5ee0, 0xc4204eea98, 0xc4204eec10, 0xc420f7b260) 
/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:238 +0x4b created by kubevirt.io/kubevirt/pkg/kubecli.(*wsStreamer).Stream /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:237 +0x137 goroutine 7994 [chan receive]: kubevirt.io/kubevirt/pkg/kubecli.(*wsStreamer).Stream(0xc420aec360, 0x14b5ec0, 0xc4204eea58, 0x14b5ee0, 0xc4204eea98, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:243 +0x17d kubevirt.io/kubevirt/tests.NewConsoleExpecter.func1(0xc420b48240, 0x14b6000, 0xc420aec360, 0xc4204eea58, 0xc4204eea98) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1181 +0x61 created by kubevirt.io/kubevirt/tests.NewConsoleExpecter /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1180 +0x401 goroutine 7993 [chan receive]: kubevirt.io/kubevirt/pkg/kubecli.(*asyncWSRoundTripper).WebsocketCallback(0xc420e7f770, 0xc420d62780, 0xc420745200, 0x0, 0x0, 0x18, 0xc421077ec8) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:163 +0x32b kubevirt.io/kubevirt/pkg/kubecli.(*asyncWSRoundTripper).WebsocketCallback-fm(0xc420d62780, 0xc420745200, 0x0, 0x0, 0xc420d62780, 0xc420745200) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:313 +0x52 kubevirt.io/kubevirt/pkg/kubecli.(*WebsocketRoundTripper).RoundTrip(0xc420e7fb90, 0xc4210ad400, 0x0, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:142 +0xab kubevirt.io/kubevirt/pkg/kubecli.(*vmis).asyncSubresourceHelper.func1(0x14b5fc0, 0xc420e7fb90, 0xc4210ad400, 0xc420f3b260) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:328 +0x56 created by kubevirt.io/kubevirt/pkg/kubecli.(*vmis).asyncSubresourceHelper /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:326 +0x33a goroutine 5048 [chan send, 31 minutes]: kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1.1(0x14f12e0, 0xc420579e80, 0xc420144920, 0xc420fa7440, 0xc420144ee8, 0xc420144f48) /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:81 +0x138 created by kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1 
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:73 +0x386 goroutine 8006 [runnable]: io.(*pipe).Read(0xc420b0eff0, 0xc420b2c000, 0x2000, 0x2000, 0xc420b2c000, 0xa1508c, 0x1) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:50 +0x115 io.(*PipeReader).Read(0xc4204eea78, 0xc420b2c000, 0x2000, 0x2000, 0x1, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:127 +0x4c kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).waitForSession.func2(0x14b5ec0, 0xc4204eea78) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:1020 +0xdb created by kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).waitForSession /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:1039 +0x154 goroutine 4104 [chan send, 42 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42084d7a0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 8005 [select]: kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).waitForSession.func1(0xc420638fb0, 0xc420f73aa0, 0xc4204ac1e0, 0x14ba7c0, 0xc4204eea68) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:1001 +0x109 created by kubevirt.io/kubevirt/vendor/github.com/google/goexpect.(*GExpect).waitForSession /root/go/src/kubevirt.io/kubevirt/vendor/github.com/google/goexpect/expect.go:998 +0xc3 goroutine 4302 [chan send, 39 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4208f6150) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher 
/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 8019 [select]: io.(*pipe).Read(0xc420b0efa0, 0xc421126000, 0x8000, 0x8000, 0x11a0e20, 0x1, 0xc421126000) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:50 +0x115 io.(*PipeReader).Read(0xc4204eea58, 0xc421126000, 0x8000, 0x8000, 0x8000, 0x8000, 0xfb4669) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/pipe.go:127 +0x4c io.copyBuffer(0x14b5fa0, 0xc4204eec10, 0x14b5ec0, 0xc4204eea58, 0xc421126000, 0x8000, 0x8000, 0xc4212f02e0, 0xc420d77300, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:400 +0x164 io.Copy(0x14b5fa0, 0xc4204eec10, 0x14b5ec0, 0xc4204eea58, 0x2000, 0xc420ed37c0, 0x7e5620) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:362 +0x5a kubevirt.io/kubevirt/pkg/kubecli.(*wsStreamer).Stream.func1(0xc4204eec10, 0x14b5ec0, 0xc4204eea58, 0x14b5ee0, 0xc4204eea98, 0xc420f7b260) /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:233 +0x4b created by kubevirt.io/kubevirt/pkg/kubecli.(*wsStreamer).Stream /root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:232 +0xda make: *** [functest] Error 2 + make cluster-down ./cluster/down.sh