+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev
+ [[ k8s-1.11.0-dev =~ openshift-.* ]]
+ [[ k8s-1.11.0-dev =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.11.0
+ KUBEVIRT_PROVIDER=k8s-1.11.0
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/08/02 10:17:46 Waiting for host: 192.168.66.101:22
2018/08/02 10:17:49 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 10:17:57 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 10:18:02 Connected to tcp://192.168.66.101:22
++ wc -l
++ grep active
++ systemctl status docker
+ [[ 0 -eq 0 ]]
+ sleep 2
++ systemctl status docker
++ grep active
++ wc -l
+ [[ 1 -eq 0 ]]
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] using Kubernetes version: v1.11.0
[preflight] running pre-flight checks
I0802 10:18:05.662713 1266 feature_gate.go:230] feature gates: &{map[]}
I0802 10:18:05.730075 1266 kernel_validator.go:81] Validating kernel version
I0802 10:18:05.730303 1266 kernel_validator.go:96] Validating kernel config
[preflight/images] Pulling images required for setting up a Kubernetes cluster
[preflight/images] This might take a minute or two, depending on the speed of your internet connection
[preflight/images] You can also perform this action in beforehand using 'kubeadm config images pull'
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[preflight] Activating the kubelet service
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [node01 localhost] and IPs [127.0.0.1 ::1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01 localhost] and IPs [192.168.66.101 127.0.0.1 ::1]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests"
[init] this might take a minute or longer if the control plane images have to be pulled
[apiclient] All control plane components are healthy after 48.503557 seconds
[uploadconfig] storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[kubelet] Creating a ConfigMap "kubelet-config-1.11" in namespace kube-system with the configuration for the kubelets in the cluster
[markmaster] Marking the node node01 as master by adding the label "node-role.kubernetes.io/master=''"
[markmaster] Marking the node node01 as master by adding the taints [node-role.kubernetes.io/master:NoSchedule]
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node01" as an annotation
[bootstraptoken] using token: abcdef.1234567890123456
[bootstraptoken] configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: CoreDNS
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:48f9223dc682c381b843183def3a39280a23adc9eef563ab856fee3ae4da558e

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io/flannel created
clusterrolebinding.rbac.authorization.k8s.io/flannel created
serviceaccount/flannel created
configmap/kube-flannel-cfg created
daemonset.extensions/kube-flannel-ds created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node/node01 untainted
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf create -f /tmp/local-volume.yaml
storageclass.storage.k8s.io/local created
configmap/local-storage-config created
clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-pv-binding created
clusterrole.rbac.authorization.k8s.io/local-storage-provisioner-node-clusterrole created
clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-node-binding created
role.rbac.authorization.k8s.io/local-storage-provisioner-jobs-role created
rolebinding.rbac.authorization.k8s.io/local-storage-provisioner-jobs-rolebinding created
serviceaccount/local-storage-admin created
daemonset.extensions/local-volume-provisioner created
2018/08/02 10:19:06 Waiting for host: 192.168.66.102:22
2018/08/02 10:19:09 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 10:19:21 Connected to tcp://192.168.66.102:22
++ systemctl status docker
++ grep active
++ wc -l
+ [[ 1 -eq 0 ]]
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] running pre-flight checks
[WARNING RequiredIPVSKernelModulesAvailable]: the IPVS proxier will not be used, because the following required kernel modules are not loaded: [ip_vs_wrr ip_vs_sh ip_vs ip_vs_rr] or no builtin kernel ipvs support: map[ip_vs_wrr:{} ip_vs_sh:{} nf_conntrack_ipv4:{} ip_vs:{} ip_vs_rr:{}]
you can solve this problem with following methods:
1. Run 'modprobe -- ' to load missing kernel modules;
2. Provide the missing builtin kernel ipvs support
I0802 10:19:21.829179 1265 kernel_validator.go:81] Validating kernel version
I0802 10:19:21.829434 1265 kernel_validator.go:96] Validating kernel config
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"
[kubelet] Downloading configuration for the kubelet from the "kubelet-config-1.11" ConfigMap in the kube-system namespace
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[preflight] Activating the kubelet service
[tlsbootstrap] Waiting for the kubelet to perform the TLS Bootstrap...
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node02" as an annotation

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 38739968 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01 Ready master 38s v1.11.0
node02 Ready 15s v1.11.0
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME STATUS ROLES AGE VERSION
node01 Ready master 39s v1.11.0
node02 Ready 16s v1.11.0
+ make cluster-sync
./cluster/build.sh
Building ...
sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32
go version go1.10 linux/amd64
Waiting for rsyncd to be ready.
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
Compiling tests...
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.39 MB
Step 1/8 : FROM fedora:28
---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
---> Using cache
---> 84920e004a40
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
---> Using cache
---> b4f3251c6468
Step 4/8 : WORKDIR /home/virt-controller
---> Using cache
---> 813752072d9d
Step 5/8 : USER 1001
---> Using cache
---> 88b3556f36b4
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
---> a2499926bd99
Removing intermediate container 9ac9e4978a9b
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
---> Running in a4fe5ee31753
---> 5412acc4b51d
Removing intermediate container a4fe5ee31753
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "virt-controller" ''
---> Running in 8734304ad9ad
---> 0a7985e262eb
Removing intermediate container 8734304ad9ad
Successfully built 0a7985e262eb
Sending build context to Docker daemon 43.31 MB
Step 1/10 : FROM kubevirt/libvirt:4.2.0
---> 5f0bfe81a3e0
Step 2/10 : MAINTAINER "The KubeVirt Project"
---> Using cache
---> 09010a005182
Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
---> Using cache
---> fc9481693838
Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher
---> a7ceee6d94f1
Removing intermediate container b4fd57446b49
Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt
---> b6428607ddaa
Removing intermediate container dd0b09a824bd
Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
---> Running in 17d8a22542df
---> 84d9995bd231
Removing intermediate container 17d8a22542df
Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
---> Running in 2b39fc270caf
---> cafc29527891
Removing intermediate
container 2b39fc270caf Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/ ---> 1dfafc4dd1f6 Removing intermediate container c4593eb6a7c8 Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh ---> Running in d7107aefb8f1 ---> 966f8fc7521c Removing intermediate container d7107aefb8f1 Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "virt-launcher" '' ---> Running in 2268bbbe7d3f ---> 52b575c00510 Removing intermediate container 2268bbbe7d3f Successfully built 52b575c00510 Sending build context to Docker daemon 41.74 MB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/5 : COPY virt-handler /usr/bin/virt-handler ---> 62abe82c47dc Removing intermediate container 0e2227b3a232 Step 4/5 : ENTRYPOINT /usr/bin/virt-handler ---> Running in c108fb24b64f ---> 0287eede0790 Removing intermediate container c108fb24b64f Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "virt-handler" '' ---> Running in ec2a2eeb55cb ---> 23175799f685 Removing intermediate container ec2a2eeb55cb Successfully built 23175799f685 Sending build context to Docker daemon 38.81 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api ---> Using cache ---> 3cff23506e80 Step 4/8 : WORKDIR /home/virt-api ---> Using cache ---> e94c5606b96b Step 5/8 : USER 1001 ---> Using cache ---> af16317199f5 Step 6/8 : COPY virt-api /usr/bin/virt-api ---> d4c70992aee7 Removing intermediate container bbd63d2fd243 Step 7/8 : ENTRYPOINT /usr/bin/virt-api ---> Running in dab89f009d2e ---> 49612c492d8a Removing intermediate container dab89f009d2e Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "virt-api" '' ---> Running in 73b670ef9769 ---> a343971e6fa8 Removing intermediate container 73b670ef9769 Successfully built a343971e6fa8 Sending build context to Docker daemon 4.096 kB Step 1/7 : FROM fedora:28 ---> cc510acfcd70 Step 2/7 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/7 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img ---> Using cache ---> c7d0cf8fc982 Step 5/7 : ADD entrypoint.sh / ---> Using cache ---> 0393e5ee0c37 Step 6/7 : CMD /entrypoint.sh ---> Using cache ---> 23798f49dea3 Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.11.0-dev1" '' ---> Using cache ---> 3ab858e83f5f Successfully built 3ab858e83f5f Sending build context to Docker daemon 2.56 kB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/5 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all ---> Using cache ---> d8c990eaf575 Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "vm-killer" '' ---> Using cache ---> fb39ada92bc4 Successfully built fb39ada92bc4 Sending build context to Docker daemon 5.12 kB Step 1/7 : FROM debian:sid ---> 68f33cf86aab Step 2/7 : MAINTAINER "David Vossel" \ ---> Using cache ---> 50fc79ebe51c Step 3/7 : ENV container docker ---> Using cache ---> b8e063496923 Step 4/7 : RUN apt-get update && 
apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/* ---> Using cache ---> 8adb1572b35c Step 5/7 : ADD entry-point.sh / ---> Using cache ---> 8c0c5a52e4df Step 6/7 : CMD /entry-point.sh ---> Using cache ---> 1a4b838e5dee Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "registry-disk-v1alpha" '' ---> Using cache ---> 2140dd571c2c Successfully built 2140dd571c2c Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33270/kubevirt/registry-disk-v1alpha:devel ---> 2140dd571c2c Step 2/4 : MAINTAINER "David Vossel" \ ---> Using cache ---> d03302a8d943 Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img ---> Using cache ---> 43ba33b466fd Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev1" '' ---> Using cache ---> fcf6905c1d06 Successfully built fcf6905c1d06 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33270/kubevirt/registry-disk-v1alpha:devel ---> 2140dd571c2c Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 4a89770d4a13 Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2 ---> Using cache ---> 3db226e217b1 Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev1" '' ---> Using cache ---> d0936d988e80 Successfully built d0936d988e80 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33270/kubevirt/registry-disk-v1alpha:devel ---> 2140dd571c2c Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 4a89770d4a13 Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso ---> Using cache ---> 5f6799836682 Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev1" '' ---> Using cache ---> 12c4e6a8d751 Successfully built 12c4e6a8d751 Sending build context to Docker daemon 35.59 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl ---> Using cache ---> d74088d7a4fc Step 4/8 : WORKDIR /home/virtctl ---> Using cache ---> c8c857bf8d96 Step 5/8 : USER 1001 ---> Using cache ---> 36730a67b946 Step 6/8 : COPY subresource-access-test /subresource-access-test ---> 29b17b7041c3 Removing intermediate container a8d1d796791e Step 7/8 : ENTRYPOINT /subresource-access-test ---> Running in 686bad836adc ---> fa2843bc0c59 Removing intermediate container 686bad836adc Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "subresource-access-test" '' ---> Running in c63bcc056d1d ---> 601898c29207 Removing intermediate container c63bcc056d1d Successfully built 601898c29207 Sending build context to Docker daemon 3.072 kB Step 1/9 : FROM fedora:28 ---> cc510acfcd70 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/9 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all ---> Using cache ---> 6050b24a5d85 Step 5/9 : ENV GIMME_GO_VERSION 1.9.2 ---> Using cache ---> 0447d2178073 Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh ---> Using cache ---> 291db82d955f Step 7/9 : 
ENV GOPATH "/go" GOBIN "/usr/bin" ---> Using cache ---> 793556477837 Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli ---> Using cache ---> fd5c6e1f9461 Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev1" '' "winrmcli" '' ---> Using cache ---> 0e3c88ca2e2b Successfully built 0e3c88ca2e2b Sending build context to Docker daemon 36.8 MB Step 1/5 : FROM fedora:27 ---> 9110ae7f579f Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 71a8c548e503 Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar ---> e46651159d9f Removing intermediate container 6d40888a2d83 Step 4/5 : ENTRYPOINT /example-hook-sidecar ---> Running in d70448828eae ---> 4ed4cb64749a Removing intermediate container d70448828eae Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.11.0-dev1" '' ---> Running in 93966e904485 ---> 31c6ccefc055 Removing intermediate container 93966e904485 Successfully built 31c6ccefc055 hack/build-docker.sh push The push refers to a repository [localhost:33270/kubevirt/virt-controller] 8acb3ac2782b: Preparing b2f5abdac324: Preparing 891e1e4ef82a: Preparing b2f5abdac324: Pushed 8acb3ac2782b: Pushed 891e1e4ef82a: Pushed devel: digest: sha256:464faba3830b0561a0a08ec9ff039b55175b20e269796105f338de31379483d4 size: 949 The push refers to a repository [localhost:33270/kubevirt/virt-launcher] 6e9e294f5e38: Preparing 04359919c014: Preparing 11546f1f69f6: Preparing 09be8d0d8e7c: Preparing 75026db39478: Preparing 0b99c4111657: Preparing da38cf808aa5: Preparing b83399358a92: Preparing 186d8b3e4fd8: Preparing fa6154170bf5: Preparing 5eefb9960a36: Preparing 891e1e4ef82a: Preparing 0b99c4111657: Waiting b83399358a92: Waiting 5eefb9960a36: Waiting 186d8b3e4fd8: Waiting fa6154170bf5: Waiting 891e1e4ef82a: Waiting 04359919c014: Pushed 09be8d0d8e7c: Pushed 6e9e294f5e38: Pushed da38cf808aa5: Pushed b83399358a92: Pushed 11546f1f69f6: Pushed fa6154170bf5: Pushed 186d8b3e4fd8: Pushed 891e1e4ef82a: Mounted from kubevirt/virt-controller 75026db39478: Pushed 0b99c4111657: Pushed 5eefb9960a36: Pushed devel: digest: sha256:d547a68b61de3d8e5a8814ae60798c9602ce98888694d6611564ddb5313d8d74 size: 2828 The push refers to a repository [localhost:33270/kubevirt/virt-handler] 02003d1cbd45: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-launcher 02003d1cbd45: Pushed devel: digest: sha256:c1b9b6730679b46c27bc2bb93f7ffec4e92f86ffc15d90453bfe4a141e63d1e3 size: 741 The push refers to a repository [localhost:33270/kubevirt/virt-api] b8302a2be594: Preparing afd1d781e4d1: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-handler afd1d781e4d1: Pushed b8302a2be594: Pushed devel: digest: sha256:9dd0e8eb478a9d1630579c8be0130c465c095c2ce8cb9c9a7ace3be467764983 size: 948 The push refers to a repository [localhost:33270/kubevirt/disks-images-provider] dc0875c44573: Preparing 8fc77a44094f: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-api dc0875c44573: Pushed 8fc77a44094f: Pushed devel: digest: sha256:4adc82f44c4222228a7930ac68eb940135861f3b390df82fd493b34719b17395 size: 948 The push refers to a repository [localhost:33270/kubevirt/vm-killer] d1b69e768421: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider d1b69e768421: Pushed devel: digest: sha256:9b400d02d2dbdfef19c1ee9350810756f6c48311b8da49e6039ea6c9741880c3 size: 740 The push refers to a repository [localhost:33270/kubevirt/registry-disk-v1alpha] 2a15632f54d4: Preparing 
91a924e03d7c: Preparing 25edbec0eaea: Preparing 2a15632f54d4: Pushed 91a924e03d7c: Pushed 25edbec0eaea: Pushed devel: digest: sha256:ae9e3df36ecb8ff92369c1528e51201145f53db9cfec2a501af171073fd84c7f size: 948 The push refers to a repository [localhost:33270/kubevirt/cirros-registry-disk-demo] 0f38e1a82d29: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 91a924e03d7c: Mounted from kubevirt/registry-disk-v1alpha 2a15632f54d4: Mounted from kubevirt/registry-disk-v1alpha 25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha 0f38e1a82d29: Pushed devel: digest: sha256:e83cc3d9bf9f3b13808d5188261f0084bd175e682d95bb97c9db361b46bcf357 size: 1160 The push refers to a repository [localhost:33270/kubevirt/fedora-cloud-registry-disk-demo] da380ddf255b: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 2a15632f54d4: Mounted from kubevirt/cirros-registry-disk-demo 91a924e03d7c: Mounted from kubevirt/cirros-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo da380ddf255b: Pushed devel: digest: sha256:9a614c3b8040fbe500365278ad22fe8cebae250cfb6ba1530a6720a8fdf2c09e size: 1161 The push refers to a repository [localhost:33270/kubevirt/alpine-registry-disk-demo] 0934ac235ed3: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 91a924e03d7c: Mounted from kubevirt/fedora-cloud-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo 2a15632f54d4: Mounted from kubevirt/fedora-cloud-registry-disk-demo 0934ac235ed3: Pushed devel: digest: sha256:5e9e64016614776b436c406762d17b93e3130a8cc871807185f08c900812cb8b size: 1160 The push refers to a repository [localhost:33270/kubevirt/subresource-access-test] 380f1ba5493d: Preparing 4052ce9d0aff: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/vm-killer 4052ce9d0aff: Pushed 380f1ba5493d: Pushed devel: digest: sha256:5b38061224108df7501351be1560d79063646e79aa964159be01a123c92a54a0 size: 948 The push refers to a repository [localhost:33270/kubevirt/winrmcli] 64ccc7ac4271: Preparing 4242962b50c3: Preparing 0e374d8c733e: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/subresource-access-test 64ccc7ac4271: Pushed 0e374d8c733e: Pushed 4242962b50c3: Pushed devel: digest: sha256:9839cdf62d05a776b476afdcbe481598849b063d8ca01f053ee8d2dc682fc124 size: 1165 The push refers to a repository [localhost:33270/kubevirt/example-hook-sidecar] 97ebddf8b934: Preparing 39bae602f753: Preparing 97ebddf8b934: Pushed 39bae602f753: Pushed devel: digest: sha256:a7e6ea6278425dbaf826ea971fc94005bb5346c84cb9aea6632e4159599da8d5 size: 740 make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt' Done ./cluster/clean.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/tests ++ 
APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.11.0-dev ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-dev1 ++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-dev1 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-180-g2b4f231 ++ KUBEVIRT_VERSION=v0.7.0-180-g2b4f231 + source cluster/k8s-1.11.0/provider.sh ++ set -e ++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.11.0.sh ++ source hack/config-provider-k8s-1.11.0.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl +++ docker_prefix=localhost:33270/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Cleaning up ...' Cleaning up ... 
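The cleanup pass that follows removes everything KubeVirt installed by deleting objects by the kubevirt.io label in each target namespace rather than by name. Below is a condensed sketch of that pattern: the namespace and resource lists are taken from the trace itself, while the script framing (shebang, loop variables) is illustrative and not the literal contents of cluster/clean.sh.

#!/bin/bash
# Sketch of the label-based cleanup performed by the trace below; not the actual clean.sh.
namespaces=(default kube-system)
resources=(apiservices deployment rs services validatingwebhookconfiguration
           secrets pv pvc ds customresourcedefinitions pods clusterrolebinding
           rolebinding roles clusterroles serviceaccounts)
for ns in "${namespaces[@]}"; do
  for res in "${resources[@]}"; do
    # Label selectors make the pass idempotent: when nothing matches, kubectl
    # prints "No resources found" and exits 0, exactly as seen in the trace.
    # Cluster-scoped kinds (clusterroles, CRDs, ...) simply ignore -n.
    kubectl -n "$ns" delete "$res" -l kubevirt.io
  done
done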
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n default delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete secrets -l kubevirt.io No resources found + _kubectl -n default delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pv -l kubevirt.io No resources found + _kubectl -n default delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pvc -l kubevirt.io No resources found + _kubectl -n default delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete ds -l kubevirt.io No resources found + _kubectl -n default delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n default delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pods -l kubevirt.io No resources found + 
_kubectl -n default delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n default delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rolebinding -l kubevirt.io No resources found + _kubectl -n default delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete roles -l kubevirt.io No resources found + _kubectl -n default delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterroles -l kubevirt.io No resources found + _kubectl -n default delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io No resources found. Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + for i in '${namespaces[@]}' + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete deployment -l kubevirt.io No resources found + _kubectl -n kube-system delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rs -l kubevirt.io No resources found + _kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export 
KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io No resources found. 
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done ./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.11.0-dev ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-dev1 ++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-dev1 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-180-g2b4f231 ++ KUBEVIRT_VERSION=v0.7.0-180-g2b4f231 + source cluster/k8s-1.11.0/provider.sh ++ set -e ++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.11.0.sh ++ source hack/config-provider-k8s-1.11.0.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig +++ 
kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl +++ docker_prefix=localhost:33270/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... + [[ -z k8s-1.11.0-dev ]] + [[ k8s-1.11.0-dev =~ .*-dev ]] + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/dev -R + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/dev -R serviceaccount/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver-auth-delegator created rolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created role.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-controller created serviceaccount/kubevirt-controller created serviceaccount/kubevirt-privileged created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller-cluster-admin created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-privileged-cluster-admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:edit created clusterrole.rbac.authorization.k8s.io/kubevirt.io:view created clusterrole.rbac.authorization.k8s.io/kubevirt.io:default created clusterrolebinding.rbac.authorization.k8s.io/kubevirt.io:default created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancereplicasets.kubevirt.io created service/virt-api created deployment.extensions/virt-api created service/virt-controller created deployment.extensions/virt-controller created daemonset.extensions/virt-handler created customresourcedefinition.apiextensions.k8s.io/virtualmachines.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstances.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancepresets.kubevirt.io created + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R persistentvolumeclaim/disk-alpine created persistentvolume/host-path-disk-alpine created persistentvolumeclaim/disk-custom created persistentvolume/host-path-disk-custom created daemonset.extensions/disks-images-provider created serviceaccount/kubevirt-testing created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-testing-cluster-admin created + [[ k8s-1.11.0 =~ os-* ]] + echo Done Done + namespaces=(kube-system default) + [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]] + timeout=300 + sample=30 + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get 
pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n 'virt-api-7d79975b94-crmbq 0/1 ContainerCreating 0 1s
virt-controller-67dcdd8464-shxnk 0/1 ContainerCreating 0 1s
virt-controller-67dcdd8464-v5j45 0/1 ContainerCreating 0 1s
virt-handler-snkp5 0/1 ContainerCreating 0 1s
virt-handler-znjs4 0/1 ContainerCreating 0 1s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ grep -v Running
+ cluster/kubectl.sh get pods -n kube-system --no-headers
virt-api-7d79975b94-crmbq 0/1 ContainerCreating 0 1s
virt-controller-67dcdd8464-shxnk 0/1 ContainerCreating 0 1s
virt-controller-67dcdd8464-v5j45 0/1 ContainerCreating 0 1s
virt-handler-snkp5 0/1 ContainerCreating 0 1s
virt-handler-znjs4 0/1 ContainerCreating 0 1s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n kube-system --no-headers
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME READY STATUS RESTARTS AGE
coredns-78fcdf6894-8n5v4 1/1 Running 0 7m
coredns-78fcdf6894-gkz9s 1/1 Running 0 7m
disks-images-provider-4qn9k 1/1 Running 0 30s
disks-images-provider-t2mkf 1/1 Running 0 31s
etcd-node01 1/1 Running 0 6m
kube-apiserver-node01 1/1 Running 0 7m
kube-controller-manager-node01 1/1 Running 0 6m
kube-flannel-ds-hbhcg 1/1 Running 0 7m
kube-flannel-ds-ls9br 1/1 Running 0 7m
kube-proxy-5jj2r 1/1 Running 0 7m
kube-proxy-bwd2c 1/1 Running 0 7m
kube-scheduler-node01 1/1 Running 0 7m
virt-api-7d79975b94-crmbq 1/1 Running 0 32s
virt-controller-67dcdd8464-shxnk 1/1 Running 0 32s
virt-controller-67dcdd8464-v5j45 1/1 Running 0 32s
virt-handler-snkp5 1/1 Running 0 32s
virt-handler-znjs4 1/1 Running 0 32s
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n default --no-headers
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
NAME READY STATUS RESTARTS AGE
local-volume-provisioner-dqgs6 1/1 Running 0 7m
local-volume-provisioner-zqnsv 1/1 Running 0 7m
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:17:28Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:08:34Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/junit.xml'
+ [[ k8s-1.11.0-dev =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor
--junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/junit.xml' + make functest hack/dockerized "hack/build-func-tests.sh" sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32 go version go1.10 linux/amd64 go version go1.10 linux/amd64 Compiling tests... compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1533205624 Will run 151 of 151 specs Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:26:37.536296Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:26:44.148296Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:26:44.154927Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 10:26:44 http: TLS handshake error from 10.244.1.1:50002: EOF level=info timestamp=2018-08-02T10:26:48.448828Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:26:48.449171Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:26:48.578912Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:26:54 http: TLS handshake error from 10.244.1.1:50060: EOF level=info timestamp=2018-08-02T10:26:58.587913Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:26:58.899162Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:27:03.040460Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:27:04 http: TLS handshake error from 10.244.1.1:50086: EOF 2018/08/02 10:27:14 http: TLS handshake error from 10.244.1.1:50118: EOF level=info timestamp=2018-08-02T10:27:18.531203Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:27:18.774396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:26:26.401367Z pos=virtinformers.go:107 
component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T10:26:26.401419Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T10:26:26.401437Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T10:26:26.401454Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T10:26:26.401478Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T10:26:26.401494Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T10:26:26.401508Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T10:26:26.401525Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:26:26.401596Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T10:26:26.405305Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T10:26:26.405361Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:26:26.405398Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:26:26.405477Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:26:29.948299Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T10:26:29.962860Z pos=vm.go:212 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-02T10:26:29.965593Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-02T10:26:30.064245Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-02T10:26:30.182022Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-08-02T10:26:30.184538Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:27:19.997089Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:27:19.997764Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:27:19.997837Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" level=info timestamp=2018-08-02T10:27:20.158208Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:27:20.158290Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:27:20.158312Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:27:20.158388Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:27:20.159178Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:27:20.159298Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" level=info timestamp=2018-08-02T10:27:20.479565Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:27:20.479656Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:27:20.479702Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:27:20.479850Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:27:20.481084Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:27:20.481479Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" Pod name: virt-launcher-testvmibwhxl-mf6ft Pod phase: Running level=info timestamp=2018-08-02T10:27:08.604228Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:27:08.604499Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:27:08.606247Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:27:18.635294Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:27:18.665081Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibwhxl" level=info timestamp=2018-08-02T10:27:18.666206Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:27:18.666360Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:29:24 http: TLS handshake error from 10.244.1.1:50208: EOF level=info timestamp=2018-08-02T10:29:28.558923Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:29:29.075643Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:29:33.366375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" 
proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:29:34 http: TLS handshake error from 10.244.1.1:50214: EOF 2018/08/02 10:29:44 http: TLS handshake error from 10.244.1.1:50220: EOF level=info timestamp=2018-08-02T10:29:48.788102Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:29:49.008069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:29:54 http: TLS handshake error from 10.244.1.1:50226: EOF level=info timestamp=2018-08-02T10:29:58.601450Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:29:59.092223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:30:03.419714Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:30:03.654601Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:30:03.655621Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 10:30:04 http: TLS handshake error from 10.244.1.1:50232: EOF Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:26:26.401367Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T10:26:26.401419Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T10:26:26.401437Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T10:26:26.401454Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T10:26:26.401478Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T10:26:26.401494Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T10:26:26.401508Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T10:26:26.401525Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:26:26.401596Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T10:26:26.405305Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." 
level=info timestamp=2018-08-02T10:26:26.405361Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:26:26.405398Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:26:26.405477Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:26:29.948299Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T10:26:29.962860Z pos=vm.go:212 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T10:26:29.965593Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-02T10:26:30.064245Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-02T10:26:30.182022Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" level=info timestamp=2018-08-02T10:26:30.184538Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:28:00.820610Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:28:00.822306Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:28:00.822404Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" level=info timestamp=2018-08-02T10:28:41.782603Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:28:41.782738Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:28:41.782771Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:28:41.782865Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:28:41.783573Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:28:41.783724Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" level=info timestamp=2018-08-02T10:30:03.703975Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:03.704068Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:30:03.704108Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:03.704243Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:30:03.705043Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:30:03.705148Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl"
Pod name: virt-launcher-testvmibwhxl-mf6ft
Pod phase: Running
level=info timestamp=2018-08-02T10:27:08.604228Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:27:08.604499Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:27:08.606247Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:27:18.635294Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:27:18.665081Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibwhxl"
level=info timestamp=2018-08-02T10:27:18.666206Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:27:18.666360Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.387 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with 3 CPU cores
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:56
      should report 3 cpu cores under guest OS [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:62
      Unexpected Warning event received: testvmibwhxl,99d59caf-963e-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt
      Expected
        : Warning
      not to equal
        : Warning
      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-08-02T10:27:05.422175Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmibwhxl-mf6ft"
level=info timestamp=2018-08-02T10:27:20.435179Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmibwhxl-mf6ft"
level=error timestamp=2018-08-02T10:27:20.457639Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid=99d59caf-963e-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
•
Pod name: disks-images-provider-4qn9k
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq
Pod phase: Running
level=info timestamp=2018-08-02T10:29:33.366375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
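The failure above is the harness reacting to a Kubernetes event, not to the guest itself: the spec is failed as soon as any Warning event is recorded against the VirtualMachineInstance, which is what the "Expected : Warning not to equal : Warning" assertion at tests/utils.go:247 boils down to. As a hedged illustration only (this is not the actual tests/utils.go helper; clientset, namespace and name are placeholders), such a check can be written against a recent client-go roughly like this:

    // Hedged sketch: fail if any Warning event exists for the named object,
    // mirroring the "unexpected warning event received" check in the log above.
    package eventcheck

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/fields"
        "k8s.io/client-go/kubernetes"
    )

    func failOnWarningEvents(clientset kubernetes.Interface, namespace, name string) error {
        // Select only Warning events whose involvedObject is the VMI under test.
        selector := fields.Set{
            "involvedObject.name": name,
            "type":                "Warning",
        }.AsSelector().String()

        events, err := clientset.CoreV1().Events(namespace).List(context.TODO(),
            metav1.ListOptions{FieldSelector: selector})
        if err != nil {
            return err
        }
        if len(events.Items) > 0 {
            ev := events.Items[0]
            return fmt.Errorf("unexpected warning event received: %s: %s", ev.Reason, ev.Message)
        }
        return nil
    }

In this run the Warning comes from virt-handler's sync loop, so the guest-visible check (3 CPU cores reported inside the guest) is never reached.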
2018/08/02 10:29:34 http: TLS handshake error from 10.244.1.1:50214: EOF 2018/08/02 10:29:44 http: TLS handshake error from 10.244.1.1:50220: EOF level=info timestamp=2018-08-02T10:29:48.788102Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:29:49.008069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:29:54 http: TLS handshake error from 10.244.1.1:50226: EOF level=info timestamp=2018-08-02T10:29:58.601450Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:29:59.092223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:30:03.419714Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:30:03.654601Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:30:03.655621Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 10:30:04 http: TLS handshake error from 10.244.1.1:50232: EOF 2018/08/02 10:30:14 http: TLS handshake error from 10.244.1.1:50238: EOF level=info timestamp=2018-08-02T10:30:18.839101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:30:19.055772Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:26:26.401508Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T10:26:26.401525Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:26:26.401596Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T10:26:26.405305Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T10:26:26.405361Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:26:26.405398Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:26:26.405477Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." 
level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:30:21.993110Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:30:21.995182Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:30:21.995283Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:22.001175Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid9p86, existing: true\n" level=info timestamp=2018-08-02T10:30:22.001243Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:30:22.001266Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:22.001366Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:30:22.006917Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:30:22.007134Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:22.028865Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid9p86, existing: true\n" level=info timestamp=2018-08-02T10:30:22.029019Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:30:22.029048Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:22.029130Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:30:22.029902Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:30:22.030494Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:03.705148Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibwhxl" level=info timestamp=2018-08-02T10:30:04.874355Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.874426Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:30:04.874457Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
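The error that dominates the virt-handler output above, "Failed to open current namespace: Error detecting namespace type from path: /proc/<pid>/task/<tid>/ns/mnt", refers to the Linux convention that /proc/<pid>/ns/<type> is a symlink whose target encodes the namespace type and inode, for example "mnt:[4026531840]". The sketch below shows only that generic mechanism (namespaceType is a hypothetical helper, not virt-handler's isolation code); one plausible reading of the repeated failures is that the readlink targets a task directory whose thread has already gone away, which would also explain why the task id in the path changes between retries.

    // Hedged sketch: how a namespace type can be read back from a
    // /proc/<pid>/ns/<type> path on Linux. Generic behaviour only.
    package nsdetect

    import (
        "fmt"
        "os"
        "strings"
    )

    // namespaceType readlinks a path such as /proc/self/ns/mnt; the target has
    // the form "mnt:[4026531840]", so the text before ':' names the namespace
    // type. If the pid/tid directory no longer exists, the readlink fails and
    // an error like the "Error detecting namespace type from path" messages
    // above is all a caller can report.
    func namespaceType(path string) (string, error) {
        target, err := os.Readlink(path)
        if err != nil {
            return "", fmt.Errorf("error detecting namespace type from path %s: %v", path, err)
        }
        idx := strings.IndexByte(target, ':')
        if idx < 0 {
            return "", fmt.Errorf("unexpected namespace link target %q", target)
        }
        return target[:idx], nil
    }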
Pod name: virt-launcher-testvmid9p86-zjk7l Pod phase: Running level=info timestamp=2018-08-02T10:30:10.331535Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:30:10.331820Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:30:10.335585Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:30:20.347885Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:30:20.409578Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid9p86" level=info timestamp=2018-08-02T10:30:20.411609Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:30:20.411758Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:32:19.048420Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:32:19.261658Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:32:24 http: TLS handshake error from 10.244.1.1:50316: EOF level=info timestamp=2018-08-02T10:32:28.597967Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:32:29.203079Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:32:33.690483Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:32:34 http: TLS handshake error from 10.244.1.1:50322: EOF 2018/08/02 10:32:44 http: TLS handshake error from 10.244.1.1:50328: EOF level=info timestamp=2018-08-02T10:32:49.098136Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:32:49.318773Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:32:54 http: TLS handshake error from 10.244.1.1:50334: EOF level=info timestamp=2018-08-02T10:32:58.559446Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:32:59.224294Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:33:03.745515Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:33:04 http: TLS handshake error from 10.244.1.1:50340: EOF Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:26:26.401508Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T10:26:26.401525Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:26:26.401596Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T10:26:26.405305Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T10:26:26.405361Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:26:26.405398Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:26:26.405477Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation 
cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:31:02.966929Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:31:02.967767Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:31:02.967857Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:31:43.928095Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid9p86, existing: true\n" level=info timestamp=2018-08-02T10:31:43.928178Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:31:43.928202Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:31:43.928301Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:31:43.929827Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:31:43.929928Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:05.850216Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid9p86, existing: true\n" level=info timestamp=2018-08-02T10:33:05.850289Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:33:05.850312Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:33:05.850389Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:33:05.853225Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:33:05.853309Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
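Both components above recover from a failed sync in the same way: the object's key goes back onto a rate-limited work queue ("re-enqueuing VirtualMachineInstance …" in virt-handler, "reenqueuing VirtualMachineInstance …" in virt-controller), so an error that never clears simply repeats until the 180-second test timeout expires. A minimal sketch of that generic client-go workqueue pattern follows; runWorker, execute and maxRetries are illustrative placeholders, not the real vm.go control loop.

    // Hedged sketch of the standard re-enqueue-on-error worker built on
    // k8s.io/client-go/util/workqueue; execute stands in for the sync logic
    // that keeps failing in the log above.
    package requeue

    import "k8s.io/client-go/util/workqueue"

    const maxRetries = 5 // illustrative; a controller may also retry forever

    func runWorker(queue workqueue.RateLimitingInterface, execute func(key string) error) {
        for {
            item, shutdown := queue.Get()
            if shutdown {
                return
            }
            key := item.(string)

            if err := execute(key); err != nil && queue.NumRequeues(key) < maxRetries {
                // Equivalent of the "re-enqueuing VirtualMachineInstance ..." lines:
                // put the key back with a backoff and try again later.
                queue.AddRateLimited(key)
            } else {
                // Sync succeeded (or retries are exhausted): reset the backoff state.
                queue.Forget(key)
            }
            queue.Done(item)
        }
    }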
Pod name: virt-launcher-testvmid9p86-zjk7l
Pod phase: Running
level=info timestamp=2018-08-02T10:30:10.331535Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:30:10.331820Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:30:10.335585Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:30:20.347885Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:30:20.409578Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid9p86"
level=info timestamp=2018-08-02T10:30:20.411609Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:30:20.411758Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.353 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-2Mi [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
        Unexpected Warning event received: testvmid9p86,062bbeef-963f-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt
        Expected
          : Warning
        not to equal
          : Warning
        /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VM
level=info timestamp=2018-08-02T10:30:07.085285Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmid9p86 kind=VirtualMachineInstance uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmid9p86-zjk7l"
level=info timestamp=2018-08-02T10:30:22.355271Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmid9p86 kind=VirtualMachineInstance uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmid9p86-zjk7l"
level=error timestamp=2018-08-02T10:30:22.390815Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmid9p86 kind=VirtualMachineInstance uid=062bbeef-963f-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt"
S [SKIPPING] [0.215 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with hugepages
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164
      should consume hugepages
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        hugepages-1Gi [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
No node with hugepages hugepages-1Gi capacity
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:216 ------------------------------ • Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:32:28.597967Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:32:29.203079Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:32:33.690483Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:32:34 http: TLS handshake error from 10.244.1.1:50322: EOF 2018/08/02 10:32:44 http: TLS handshake error from 10.244.1.1:50328: EOF level=info timestamp=2018-08-02T10:32:49.098136Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:32:49.318773Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:32:54 http: TLS handshake error from 10.244.1.1:50334: EOF level=info timestamp=2018-08-02T10:32:58.559446Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:32:59.224294Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:33:03.745515Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:33:04 http: TLS handshake error from 10.244.1.1:50340: EOF 2018/08/02 10:33:14 http: TLS handshake error from 10.244.1.1:50346: EOF level=info timestamp=2018-08-02T10:33:19.149542Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:33:19.372437Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" 
level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info 
timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:33:23.336150Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:33:23.336498Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:33:23.336566Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4bjrt" level=info timestamp=2018-08-02T10:33:23.356719Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4bjrt, existing: true\n" level=info timestamp=2018-08-02T10:33:23.356772Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:33:23.356795Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:33:23.356898Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:33:23.357610Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
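The virt-controller entries above of the form "Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io …: the object has been modified; please apply your changes to the latest version and try again" are ordinary optimistic-concurrency conflicts (HTTP 409): something else updated the object between the controller's read and its write, and the controller reacts by requeuing, as the "reenqueuing VirtualMachineInstance" suffix shows. For code that prefers to resolve such a conflict in place, client-go provides retry.RetryOnConflict; the sketch below is a generic illustration with updateVMIStatus as a placeholder, not KubeVirt's own handling.

    // Hedged sketch: retrying an update that can hit "the object has been
    // modified" conflicts, using client-go's retry helper.
    package conflictretry

    import "k8s.io/client-go/util/retry"

    func updateWithRetry(updateVMIStatus func() error) error {
        // RetryOnConflict re-runs the closure whenever it returns a 409 Conflict
        // error, so the closure should re-read the latest object, re-apply the
        // change and attempt the update again on each pass.
        return retry.RetryOnConflict(retry.DefaultRetry, updateVMIStatus)
    }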
level=info timestamp=2018-08-02T10:33:23.357696Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4bjrt" level=info timestamp=2018-08-02T10:33:23.398383Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4bjrt, existing: true\n" level=info timestamp=2018-08-02T10:33:23.399242Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:33:23.399270Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:33:23.399349Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:33:23.400127Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:33:23.400214Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4bjrt" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmi4bjrt-mh6wt Pod phase: Running level=info timestamp=2018-08-02T10:33:11.867438Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:33:11.867758Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:33:11.869157Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:33:21.876452Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:33:21.899398Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4bjrt" level=info timestamp=2018-08-02T10:33:21.901817Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:33:21.902096Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:35:19.356132Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:35:19.578765Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:35:24 http: TLS handshake error from 10.244.1.1:50424: EOF level=info timestamp=2018-08-02T10:35:28.599752Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:35:29.362107Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:35:34.013739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:35:34 http: TLS handshake error from 10.244.1.1:50430: EOF 
2018/08/02 10:35:44 http: TLS handshake error from 10.244.1.1:50436: EOF level=info timestamp=2018-08-02T10:35:49.404988Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:35:49.628305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:35:54 http: TLS handshake error from 10.244.1.1:50442: EOF level=info timestamp=2018-08-02T10:35:58.590908Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:35:59.388016Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:36:04.067763Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:36:04 http: TLS handshake error from 10.244.1.1:50448: EOF Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:27:04.640054Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:27:04.644075Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest 
version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:34:45.255985Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:34:45.256016Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:34:45.256102Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:34:45.256727Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
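The "Operation cannot be fulfilled ... the object has been modified; please apply your changes to the latest version and try again" messages above are ordinary optimistic-concurrency conflicts: the controller's update raced with another writer, so it re-enqueues the VMI and retries against the latest resourceVersion. Outside a controller workqueue, the usual client-go way to handle the same error class is retry.RetryOnConflict. Below is a minimal sketch under that assumption; updateVMI is a hypothetical stand-in, not a KubeVirt function.

package main

import (
	"fmt"

	"k8s.io/client-go/util/retry"
)

// updateVMI stands in for a read-modify-write against the API server: fetch
// the latest object, apply the change, call Update(). On a stale
// resourceVersion the server answers with a Conflict error, which is what the
// controller logs as "the object has been modified; please apply your changes
// to the latest version and try again".
func updateVMI() error {
	// ... GET latest object, mutate it, PUT it back ...
	return nil
}

func main() {
	// RetryOnConflict re-runs the closure whenever it returns a Conflict
	// error, re-reading the object each time before reapplying the change.
	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
		return updateVMI()
	})
	if err != nil {
		fmt.Println("update failed:", err)
	}
}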
level=info timestamp=2018-08-02T10:34:45.256818Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4bjrt" level=info timestamp=2018-08-02T10:35:49.693580Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid9p86, existing: false\n" level=info timestamp=2018-08-02T10:35:49.693669Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:35:49.693761Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:35:49.693864Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmid9p86 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:36:07.177079Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4bjrt, existing: true\n" level=info timestamp=2018-08-02T10:36:07.177176Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:36:07.177325Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:36:07.177512Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:36:07.178671Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:36:07.178777Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4bjrt" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmi4bjrt-mh6wt Pod phase: Running level=info timestamp=2018-08-02T10:33:11.867438Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:33:11.867758Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:33:11.869157Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:33:21.876452Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:33:21.899398Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4bjrt" level=info timestamp=2018-08-02T10:33:21.901817Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:33:21.902096Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" ------------------------------ • Failure in Spec Setup (BeforeEach) [180.334 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 with CPU spec /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294 when CPU model defined [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:340 should report defined CPU model /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:341 Unexpected Warning event received: testvmi4bjrt,726d094e-963f-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ level=info 
timestamp=2018-08-02T10:33:08.655436Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4bjrt kind=VirtualMachineInstance uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4bjrt-mh6wt" level=info timestamp=2018-08-02T10:33:23.704325Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4bjrt kind=VirtualMachineInstance uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi4bjrt-mh6wt" level=error timestamp=2018-08-02T10:33:23.726835Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4bjrt kind=VirtualMachineInstance uid=726d094e-963f-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:35:28.599752Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:35:29.362107Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:35:34.013739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:35:34 http: TLS handshake error from 10.244.1.1:50430: EOF 2018/08/02 10:35:44 http: TLS handshake error from 10.244.1.1:50436: EOF level=info timestamp=2018-08-02T10:35:49.404988Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:35:49.628305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:35:54 http: TLS handshake error from 10.244.1.1:50442: EOF level=info timestamp=2018-08-02T10:35:58.590908Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:35:59.388016Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:36:04.067763Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:36:04 http: TLS handshake error from 10.244.1.1:50448: EOF 2018/08/02 10:36:14 http: TLS handshake error from 10.244.1.1:50454: EOF level=info timestamp=2018-08-02T10:36:19.459882Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info 
timestamp=2018-08-02T10:36:19.684352Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: 
/registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:36:23.570196Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:36:23.572104Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:36:23.572231Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmit44xz" level=info timestamp=2018-08-02T10:36:23.592727Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmit44xz, existing: true\n" level=info timestamp=2018-08-02T10:36:23.592808Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:36:23.592842Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:36:23.593054Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:36:23.593870Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
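The "Unexpected Warning event received" failure reported above (and the similar ones later in this run) comes from the test helper at /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247, which watches the VMI's events and asserts that none of them is of type Warning; that is what produces Gomega's "Expected : Warning not to equal : Warning" output. The following is an illustrative, self-contained sketch of that style of assertion with hard-coded events, not the real helper.

package tests_test

import (
	"testing"

	. "github.com/onsi/gomega"
	k8sv1 "k8s.io/api/core/v1"
)

// A stand-in for the kind of check the functional tests perform while a VMI
// starts: every event observed for the object must not be of type Warning.
// The event list here is hard-coded for illustration; the real tests stream
// events from the cluster.
func TestNoWarningEvents(t *testing.T) {
	g := NewWithT(t)
	events := []k8sv1.Event{
		{Type: k8sv1.EventTypeNormal, Message: "Created virtual machine pod"},
		// A {Type: k8sv1.EventTypeWarning, ...} entry here would fail this
		// test the same way the specs above failed.
	}
	for _, e := range events {
		g.Expect(e.Type).NotTo(Equal(k8sv1.EventTypeWarning), e.Message)
	}
}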
level=info timestamp=2018-08-02T10:36:23.594037Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmit44xz" level=info timestamp=2018-08-02T10:36:23.634266Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmit44xz, existing: true\n" level=info timestamp=2018-08-02T10:36:23.634347Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:36:23.634374Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:36:23.634467Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:36:23.635170Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:36:23.635245Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmit44xz" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmit44xz-fdm55 Pod phase: Running level=info timestamp=2018-08-02T10:36:12.277863Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:36:12.278701Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:36:12.280938Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:36:22.290843Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:36:22.338241Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmit44xz" level=info timestamp=2018-08-02T10:36:22.340117Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:36:22.340298Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:38:29.513718Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:38:34.334345Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:38:34 http: TLS handshake error from 10.244.1.1:50538: EOF 2018/08/02 10:38:44 http: TLS handshake error from 10.244.1.1:50544: EOF level=info timestamp=2018-08-02T10:38:49.718104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:38:49.935747Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:38:54 http: TLS handshake error from 10.244.1.1:50550: EOF level=info timestamp=2018-08-02T10:38:58.528945Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 
username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:58.536283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:58.622480Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:59.534000Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:39:03.731385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:39:03.732369Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:39:04.388441Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:39:04 http: TLS handshake error from 10.244.1.1:50556: EOF Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:05.097124Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:05.098503Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisq672 kind= uid=056ca86f-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:37:45.489479Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:37:45.489508Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:37:45.489591Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:37:45.490355Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:37:45.490442Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmit44xz" level=info timestamp=2018-08-02T10:38:51.019125Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4bjrt, existing: false\n" level=info timestamp=2018-08-02T10:38:51.019211Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:38:51.019287Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:38:51.019396Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4bjrt kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:39:07.412395Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmit44xz, existing: true\n" level=info timestamp=2018-08-02T10:39:07.412529Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:39:07.412561Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:39:07.412647Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:39:07.414929Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:39:07.415061Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmit44xz" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmit44xz-fdm55 Pod phase: Running level=info timestamp=2018-08-02T10:36:12.277863Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:36:12.278701Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:36:12.280938Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:36:22.290843Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:36:22.338241Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmit44xz" level=info timestamp=2018-08-02T10:36:22.340117Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:36:22.340298Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [180.342 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 with CPU spec /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294 when CPU model equals to passthrough [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:368 should report exactly the same model as node CPU /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:369 Unexpected Warning event received: testvmit44xz,ddea8c8c-963f-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ level=info 
timestamp=2018-08-02T10:36:09.000835Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmit44xz kind=VirtualMachineInstance uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmit44xz-fdm55" level=info timestamp=2018-08-02T10:36:23.928362Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmit44xz kind=VirtualMachineInstance uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmit44xz-fdm55" level=error timestamp=2018-08-02T10:36:23.961891Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmit44xz kind=VirtualMachineInstance uid=ddea8c8c-963f-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:38:44 http: TLS handshake error from 10.244.1.1:50544: EOF level=info timestamp=2018-08-02T10:38:49.718104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:38:49.935747Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:38:54 http: TLS handshake error from 10.244.1.1:50550: EOF level=info timestamp=2018-08-02T10:38:58.528945Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:58.536283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:58.622480Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:38:59.534000Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:39:03.731385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:39:03.732369Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:39:04.388441Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:39:04 http: TLS handshake error from 10.244.1.1:50556: EOF 2018/08/02 10:39:14 http: TLS handshake error from 10.244.1.1:50562: EOF level=info timestamp=2018-08-02T10:39:19.769104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 
contentLength=136 level=info timestamp=2018-08-02T10:39:19.992832Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing 
VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:39:08.648992Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:39:08.649146Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:39:23.661617Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:39:23.662261Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:39:23.662360Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" level=info timestamp=2018-08-02T10:39:23.682860Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:39:23.682963Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:39:23.682998Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:39:23.683101Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:39:23.683702Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:39:23.683788Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" level=info timestamp=2018-08-02T10:39:23.724016Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:39:23.724312Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:39:23.724367Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:39:23.724466Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:39:23.726339Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:39:23.726565Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqzlkp-25mx9 Pod phase: Running level=info timestamp=2018-08-02T10:39:13.174845Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:39:13.175502Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:39:13.176938Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:39:23.184441Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:39:23.227179Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqzlkp" level=info timestamp=2018-08-02T10:39:23.229426Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:39:23.229603Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:41:24 http: TLS handshake error from 10.244.1.1:50640: EOF level=info timestamp=2018-08-02T10:41:28.620348Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:41:29.642408Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:41:34.654601Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:41:34 http: TLS handshake error from 10.244.1.1:50646: EOF 2018/08/02 10:41:44 http: TLS handshake error from 10.244.1.1:50652: EOF level=info timestamp=2018-08-02T10:41:50.025012Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:41:50.247054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:41:54 http: TLS handshake error from 10.244.1.1:50658: EOF level=info timestamp=2018-08-02T10:41:58.611960Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:41:59.665152Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:42:03.736439Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:42:03.737535Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:42:04.708485Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:42:04 http: TLS handshake error from 10.244.1.1:50664: EOF Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:06.350039Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:30:06.350230Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid9p86 kind= uid=062bbeef-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:30:06.478914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 
component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:39:08.648992Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:39:08.649146Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:40:45.584298Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:40:45.584321Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:40:45.584400Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:40:45.585736Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:40:45.585832Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" level=info timestamp=2018-08-02T10:41:51.255326Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmit44xz, existing: false\n" level=info timestamp=2018-08-02T10:41:51.255411Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:41:51.255486Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:41:51.255580Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmit44xz kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:07.506092Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:07.506177Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:42:07.506208Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:07.506309Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:42:07.506892Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:42:07.507249Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:30:04.874592Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.893801Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.893904Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: true\n" level=info timestamp=2018-08-02T10:30:04.893924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:30:04.893945Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.894005Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:30:04.894083Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind= uid=99d59caf-963e-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:30:04.901275Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:30:04.901299Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:30:04.901330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:30:04.901386Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:32:47.545417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibwhxl, existing: false\n" level=info timestamp=2018-08-02T10:32:47.545505Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:32:47.545575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:32:47.545722Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibwhxl kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqzlkp-25mx9 Pod phase: Running level=info timestamp=2018-08-02T10:39:13.174845Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:39:13.175502Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:39:13.176938Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:39:23.184441Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:39:23.227179Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqzlkp" level=info timestamp=2018-08-02T10:39:23.229426Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:39:23.229603Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [180.322 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 with CPU spec /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294 when CPU model not defined [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:392 should report CPU model from libvirt capabilities /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:393 Unexpected Warning event received: testvmiqzlkp,4968efff-9640-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ level=info timestamp=2018-08-02T10:39:09.292388Z 
pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiqzlkp-25mx9" level=info timestamp=2018-08-02T10:39:24.034064Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiqzlkp-25mx9" level=error timestamp=2018-08-02T10:39:24.050534Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid=4968efff-9640-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:41:34 http: TLS handshake error from 10.244.1.1:50646: EOF 2018/08/02 10:41:44 http: TLS handshake error from 10.244.1.1:50652: EOF level=info timestamp=2018-08-02T10:41:50.025012Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:41:50.247054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:41:54 http: TLS handshake error from 10.244.1.1:50658: EOF level=info timestamp=2018-08-02T10:41:58.611960Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:41:59.665152Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:42:03.736439Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:42:03.737535Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:42:04.708485Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:42:04 http: TLS handshake error from 10.244.1.1:50664: EOF 2018/08/02 10:42:14 http: TLS handshake error from 10.244.1.1:50670: EOF level=info timestamp=2018-08-02T10:42:20.080256Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:42:20.302191Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:42:24 http: TLS handshake error from 10.244.1.1:50676: EOF Pod name: 
virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:39:08.648992Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:39:08.649146Z pos=preset.go:171 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:42:08.960620Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:42:08.960837Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:42:08.997701Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibfxxr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:07.507249Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqzlkp" level=info timestamp=2018-08-02T10:42:08.968773Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.968843Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:42:08.968866Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:42:25.403338Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:42:25.404206Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:42:25.404378Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" level=info timestamp=2018-08-02T10:42:25.424638Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibfxxr, existing: true\n" level=info timestamp=2018-08-02T10:42:25.424732Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:42:25.424756Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:25.424831Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:42:25.425608Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
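The "Synchronizing the VirtualMachineInstance failed." / "re-enqueuing VirtualMachineInstance ..." pairs above and below follow the standard controller retry pattern: when a sync fails, the object's key is put back on a rate-limited work queue and retried with backoff. A minimal generic sketch of that pattern using client-go's workqueue package (the key, the syncVMI stub, and the bounded loop are illustrative assumptions, not KubeVirt's actual handler code):

```go
package main

import (
	"fmt"

	"k8s.io/client-go/util/workqueue"
)

// syncVMI stands in for the real per-VMI sync; here it always fails, the way
// the handler in this run fails to open the launcher's mount namespace.
func syncVMI(key string) error {
	return fmt.Errorf("failed to open current namespace for %s", key)
}

func main() {
	queue := workqueue.NewRateLimitingQueue(workqueue.DefaultControllerRateLimiter())
	queue.Add("kubevirt-test-default/testvmibfxxr")

	for i := 0; i < 3; i++ { // a bounded loop instead of a long-running worker
		key, shutdown := queue.Get()
		if shutdown {
			return
		}
		if err := syncVMI(key.(string)); err != nil {
			// Failed syncs are re-queued with backoff, which is what produces the
			// repeated "re-enqueuing VirtualMachineInstance ..." lines in the log.
			fmt.Printf("re-enqueuing %v: %v\n", key, err)
			queue.AddRateLimited(key)
		} else {
			queue.Forget(key)
		}
		queue.Done(key)
	}
}
```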
level=info timestamp=2018-08-02T10:42:25.425721Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" level=info timestamp=2018-08-02T10:42:25.466039Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibfxxr, existing: true\n" level=info timestamp=2018-08-02T10:42:25.466209Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:42:25.466240Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:25.466389Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:42:25.467967Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:42:25.468134Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" Pod name: virt-launcher-testvmibfxxr-mrxkx Pod phase: Running level=info timestamp=2018-08-02T10:42:13.327129Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:42:13.327806Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:42:13.329258Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:42:23.359060Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:42:23.383917Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibfxxr" level=info timestamp=2018-08-02T10:42:23.386840Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:42:23.387144Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:44:24 http: TLS handshake error from 10.244.1.1:50748: EOF level=info timestamp=2018-08-02T10:44:28.595374Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:44:29.775616Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:44:34 http: TLS handshake error from 10.244.1.1:50754: EOF level=info timestamp=2018-08-02T10:44:34.976811Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- 
method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:44:44 http: TLS handshake error from 10.244.1.1:50760: EOF level=info timestamp=2018-08-02T10:44:50.338069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:44:50.560939Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:44:54 http: TLS handshake error from 10.244.1.1:50766: EOF level=info timestamp=2018-08-02T10:44:58.630039Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:44:59.795280Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:45:03.666588Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:45:03.667811Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 10:45:04 http: TLS handshake error from 10.244.1.1:50772: EOF level=info timestamp=2018-08-02T10:45:05.035078Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-shxnk Pod phase: Running level=info timestamp=2018-08-02T10:30:06.494741Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid9p86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid9p86" level=info timestamp=2018-08-02T10:33:06.732828Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:06.734048Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit7bb8 kind= uid=71b15de0-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:33:06.767281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:06.793414Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info 
timestamp=2018-08-02T10:33:07.782195Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmit7bb8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmit7bb8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 71b15de0-963f-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmit7bb8" level=info timestamp=2018-08-02T10:33:07.960384Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:33:07.960541Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4bjrt kind= uid=726d094e-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:36:08.301514Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:36:08.301757Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmit44xz kind= uid=ddea8c8c-963f-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:39:08.648992Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:39:08.649146Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:42:08.960620Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:42:08.960837Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:42:08.997701Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibfxxr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:26:27.524580Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:42:45.881071Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:42:45.882276Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:42:45.882401Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" level=info timestamp=2018-08-02T10:43:06.362659Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibfxxr, existing: true\n" level=info timestamp=2018-08-02T10:43:06.362775Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:43:06.362805Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:43:06.362932Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:43:06.364122Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:43:06.364294Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" level=info timestamp=2018-08-02T10:43:47.324628Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibfxxr, existing: true\n" level=info timestamp=2018-08-02T10:43:47.324735Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:43:47.324760Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:43:47.324836Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:43:47.326453Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibfxxr kind= uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:43:47.326845Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibfxxr" Pod name: virt-launcher-testvmibfxxr-mrxkx Pod phase: Running level=info timestamp=2018-08-02T10:42:13.327129Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:42:13.327806Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:42:13.329258Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:42:23.359060Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:42:23.383917Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibfxxr" level=info timestamp=2018-08-02T10:42:23.386840Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:42:23.387144Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.333 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 New VirtualMachineInstance with all supported drives /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:413 should have all the device nodes [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:436 Unexpected Warning event received: testvmibfxxr,b4e2d3c7-9640-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ level=info timestamp=2018-08-02T10:42:09.596440Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmibfxxr kind=VirtualMachineInstance uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmibfxxr-mrxkx" level=info timestamp=2018-08-02T10:42:25.983397Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmibfxxr kind=VirtualMachineInstance uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmibfxxr-mrxkx" level=error timestamp=2018-08-02T10:42:25.998449Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmibfxxr kind=VirtualMachineInstance uid=b4e2d3c7-9640-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 10:44:54 http: TLS handshake error from 10.244.1.1:50766: EOF level=info timestamp=2018-08-02T10:44:58.630039Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:44:59.795280Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:45:03.666588Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:45:03.667811Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 10:45:04 http: TLS handshake error from 10.244.1.1:50772: EOF level=info timestamp=2018-08-02T10:45:05.035078Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:45:14 http: TLS handshake error from 10.244.1.1:50780: EOF level=info timestamp=2018-08-02T10:45:20.391576Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:45:20.618062Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:45:24 http: TLS handshake error from 10.244.1.1:50782: EOF level=info timestamp=2018-08-02T10:45:28.641391Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:45:29.814025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:45:34 http: TLS handshake error from 10.244.1.1:50788: EOF level=info timestamp=2018-08-02T10:45:35.088340Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:45:24.392062Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:45:24.392100Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T10:45:24.392120Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T10:45:24.392139Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T10:45:24.392163Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T10:45:24.392183Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T10:45:24.392205Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T10:45:24.392322Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." 
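The "STARTING informer ..." lines above show the restarted virt-controller bringing up its shared informers before the individual controllers begin their loops. As a generic client-go illustration of that startup sequence (the kubeconfig path and resync period are assumptions, and this uses plain client-go informers rather than KubeVirt's own virtinformers package):

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from a local kubeconfig (path is an assumption).
	config, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// One shared factory feeds several informers, which is what the
	// "STARTING informer ..." log lines correspond to in spirit.
	factory := informers.NewSharedInformerFactory(client, 30*time.Second)
	podInformer := factory.Core().V1().Pods().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	if !cache.WaitForCacheSync(stop, podInformer.HasSynced) {
		panic("informer cache never synced")
	}
	fmt.Println("informers started and synced")
}
```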
level=info timestamp=2018-08-02T10:45:24.397546Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:45:24.403446Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T10:45:24.404514Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:45:24.404632Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T10:45:24.575080Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:45:24.575470Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:45:24.663109Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmismtzd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:45:39.041057Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:45:39.041526Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:45:39.041596Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" level=info timestamp=2018-08-02T10:45:39.046045Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmismtzd, existing: true\n" level=info timestamp=2018-08-02T10:45:39.046070Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:45:39.046103Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:45:39.046149Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:45:39.046472Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:45:39.046535Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" level=info timestamp=2018-08-02T10:45:39.066728Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmismtzd, existing: true\n" level=info timestamp=2018-08-02T10:45:39.066768Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:45:39.066790Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:45:39.066846Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:45:39.067276Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:45:39.067346Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" Pod name: virt-launcher-testvmismtzd-vr2b2 Pod phase: Running level=info timestamp=2018-08-02T10:45:27.560226Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:45:27.560493Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:45:27.562094Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:45:37.567999Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:45:37.607988Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmismtzd" level=info timestamp=2018-08-02T10:45:37.610095Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:45:37.610372Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:47:28.624829Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:47:29.902403Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:47:34 http: TLS handshake error from 10.244.1.1:50860: EOF level=info timestamp=2018-08-02T10:47:35.301843Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" 
proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:47:44 http: TLS handshake error from 10.244.1.1:50866: EOF level=info timestamp=2018-08-02T10:47:50.647754Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:47:50.876872Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:47:54 http: TLS handshake error from 10.244.1.1:50872: EOF level=info timestamp=2018-08-02T10:47:58.641551Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:47:59.923899Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:48:04 http: TLS handshake error from 10.244.1.1:50878: EOF level=info timestamp=2018-08-02T10:48:05.355715Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:48:14 http: TLS handshake error from 10.244.1.1:50884: EOF level=info timestamp=2018-08-02T10:48:20.699807Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:48:20.929345Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:45:24.392062Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T10:45:24.392100Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T10:45:24.392120Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T10:45:24.392139Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T10:45:24.392163Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T10:45:24.392183Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T10:45:24.392205Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T10:45:24.392322Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T10:45:24.397546Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T10:45:24.403446Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." 
level=info timestamp=2018-08-02T10:45:24.404514Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T10:45:24.404632Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T10:45:24.575080Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:45:24.575470Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:45:24.663109Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmismtzd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:46:19.998231Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:46:19.998867Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:46:19.998968Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd" level=info timestamp=2018-08-02T10:47:00.959888Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmismtzd, existing: true\n" level=info timestamp=2018-08-02T10:47:00.959978Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:47:00.960014Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:47:00.960107Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:47:00.960647Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:47:00.961607Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd"
level=info timestamp=2018-08-02T10:48:22.882342Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmismtzd, existing: true\n"
level=info timestamp=2018-08-02T10:48:22.882692Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:48:22.882717Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:48:22.882794Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:48:22.883383Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind= uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:48:22.883455Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmismtzd"
Pod name: virt-launcher-testvmismtzd-vr2b2 Pod phase: Running
level=info timestamp=2018-08-02T10:45:27.560226Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:45:27.560493Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:45:27.562094Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:45:37.567999Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:45:37.607988Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmismtzd"
level=info timestamp=2018-08-02T10:45:37.610095Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:45:37.610372Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [195.390 seconds]
LeaderElection
/root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43
  Start a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53
    when the controller pod is not running
    /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54
      should success [It]
      /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55

      Unexpected Warning event received: testvmismtzd,295ab0b6-9641-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt
      Expected
        : Warning
      not to equal
        : Warning

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Destroying the leading controller pod
STEP: Starting a new VirtualMachineInstance
level=info timestamp=2018-08-02T10:45:25.050995Z pos=utils.go:245 component=tests
namespace=kubevirt-test-default name=testvmismtzd kind=VirtualMachineInstance uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmismtzd-vr2b2" level=info timestamp=2018-08-02T10:45:39.594561Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmismtzd kind=VirtualMachineInstance uid=295ab0b6-9641-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmismtzd-vr2b2" level=error timestamp=2018-08-02T10:45:39.605706Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmismtzd kind=VirtualMachineInstance uid=295ab0b6-9641-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" •• Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T10:52:30.115121Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:52:34 http: TLS handshake error from 10.244.1.1:51040: EOF level=info timestamp=2018-08-02T10:52:35.840938Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:52:44 http: TLS handshake error from 10.244.1.1:51046: EOF level=info timestamp=2018-08-02T10:52:51.172613Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:52:51.386320Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:52:54 http: TLS handshake error from 10.244.1.1:51052: EOF level=info timestamp=2018-08-02T10:52:58.655813Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:53:00.133260Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:53:04 http: TLS handshake error from 10.244.1.1:51058: EOF level=info timestamp=2018-08-02T10:53:05.894832Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:53:14 http: TLS handshake error from 10.244.1.1:51064: EOF level=info timestamp=2018-08-02T10:53:21.225955Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:53:21.450501Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:53:24 http: 
TLS handshake error from 10.244.1.1:51070: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:48:25.047113Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijfbc6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmijfbc6" level=info timestamp=2018-08-02T10:48:25.047168Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:25.047209Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:48:25.053261Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:25.053365Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:48:25.077101Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:25.077179Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:48:25.086230Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijfbc6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijfbc6" level=info timestamp=2018-08-02T10:48:25.092490Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:25.092596Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:48:25.094653Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijfbc6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijfbc6" level=info timestamp=2018-08-02T10:48:40.138784Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:40.138911Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the 
VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:48:40.162326Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:48:40.162415Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e5493e-9641-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:50:01.872492Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:50:01.872523Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:50:01.872613Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e9ba39-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:50:01.873294Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e9ba39-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:50:01.873392Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmijfbc6" level=info timestamp=2018-08-02T10:51:06.723741Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmismtzd, existing: false\n" level=info timestamp=2018-08-02T10:51:06.723835Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:51:06.723915Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:51:06.724021Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmismtzd kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:51:23.793664Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmijfbc6, existing: true\n" level=info timestamp=2018-08-02T10:51:23.793780Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:51:23.793820Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:51:23.793929Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e9ba39-9641-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:51:23.794707Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmijfbc6 kind= uid=94e9ba39-9641-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:51:23.794831Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmijfbc6"
Pod name: virt-launcher-testvmijfbc6-hsx7q Pod phase: Running
level=info timestamp=2018-08-02T10:48:28.420640Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:48:28.421559Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:48:28.423473Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:48:38.430452Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:48:38.471043Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmijfbc6"
level=info timestamp=2018-08-02T10:48:38.472594Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:48:38.472792Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [300.170 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should update VirtualMachine once VMIs are up [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195

    Timed out after 300.000s.
    Expected
      : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:201
------------------------------
• [SLOW TEST:10.268 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should remove VirtualMachineInstance once the VMI is marked for deletion
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:204
------------------------------
•
Pod name: disks-images-provider-4qn9k Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq Pod phase: Running
level=info timestamp=2018-08-02T10:57:51.687435Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:57:51.909182Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:57:54 http: TLS handshake error from 10.244.1.1:51232: EOF
level=info timestamp=2018-08-02T10:57:58.653928Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:58:00.343097Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:58:04 http: TLS handshake error from 10.244.1.1:51238: EOF
level=info timestamp=2018-08-02T10:58:06.483812Z pos=filter.go:46
component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:58:14 http: TLS handshake error from 10.244.1.1:51244: EOF level=info timestamp=2018-08-02T10:58:21.741608Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:58:21.963302Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:58:24 http: TLS handshake error from 10.244.1.1:51250: EOF level=info timestamp=2018-08-02T10:58:28.659758Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:58:30.364590Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:58:34 http: TLS handshake error from 10.244.1.1:51256: EOF level=info timestamp=2018-08-02T10:58:36.537418Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:53:37.728762Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="VirtualMachineInstance created bacause testvmivj59c was added." 
level=info timestamp=2018-08-02T10:53:37.728823Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:37.728886Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:53:37.729736Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:53:37.729837Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:53:37.741364Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:37.741438Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:53:37.760630Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:37.760700Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:53:37.770729Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:37.771337Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:53:52.179254Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:52.179355Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:53:52.200276Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:53:52.200363Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f48b92e-9642-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening 
interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:54:32.950074Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:54:32.951382Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:54:32.951488Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivj59c" level=info timestamp=2018-08-02T10:55:13.911805Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivj59c, existing: true\n" level=info timestamp=2018-08-02T10:55:13.911886Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:55:13.911909Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:55:13.911995Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:55:13.912607Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:55:13.912706Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivj59c" level=info timestamp=2018-08-02T10:56:35.832911Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivj59c, existing: true\n" level=info timestamp=2018-08-02T10:56:35.833010Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:56:35.833045Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:56:35.833152Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:56:35.835156Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind= uid=4f4c398e-9642-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:56:35.835263Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivj59c"
Pod name: virt-launcher-testvmivj59c-cgwpf Pod phase: Running
level=info timestamp=2018-08-02T10:53:41.412040Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:53:41.412307Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:53:41.413646Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:53:51.457879Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:53:51.478166Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivj59c"
level=info timestamp=2018-08-02T10:53:51.479248Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:53:51.479401Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [301.386 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if it gets deleted [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245

    Timed out after 300.000s.
    Expected
      : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
Pod name: disks-images-provider-4qn9k Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq Pod phase: Running
2018/08/02 10:59:54 http: TLS handshake error from 10.244.1.1:51304: EOF
level=info timestamp=2018-08-02T10:59:58.649025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T11:00:00.425506Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T11:00:03.644414Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T11:00:03.645885Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 11:00:04 http: TLS handshake error from 10.244.1.1:51310: EOF
level=info timestamp=2018-08-02T11:00:06.701046Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 11:00:14 http: TLS handshake error from 10.244.1.1:51316: EOF
level=info timestamp=2018-08-02T11:00:21.950293Z pos=filter.go:46 component=virt-api
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:00:22.163070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:00:24 http: TLS handshake error from 10.244.1.1:51322: EOF level=info timestamp=2018-08-02T11:00:28.636494Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:00:30.445804Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:00:34 http: TLS handshake error from 10.244.1.1:51328: EOF level=info timestamp=2018-08-02T11:00:36.754414Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T10:58:39.101876Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 msg="VirtualMachineInstance created bacause testvmiq8gdh was added." level=info timestamp=2018-08-02T10:58:39.102356Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:58:39.102462Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:58:39.103462Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:39.103519Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:58:39.114462Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:39.114518Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:58:39.138310Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:39.138419Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:58:39.150556Z pos=vm.go:135 component=virt-controller 
service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:39.150633Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:58:54.461778Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:54.461885Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T10:58:54.480849Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T10:58:54.480937Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ed1022-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T10:59:19.675623Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:59:19.675728Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:59:19.675821Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivj59c kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:59:35.226545Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiq8gdh, existing: true\n" level=info timestamp=2018-08-02T10:59:35.226657Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:59:35.226717Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:59:35.226809Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:59:35.228098Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
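The error that recurs from here to the end of the run is the virt-handler failure above: "Failed to open current namespace: Error detecting namespace type from path: /proc/<pid>/task/<tid>/ns/mnt". On Linux, /proc/<pid>/ns/<name> is a symlink whose target has the form "mnt:[4026531840]", i.e. the namespace type followed by the namespace inode; detecting the type means reading that link and parsing the prefix, which fails if the link cannot be read or does not have that shape (for example because the task has already exited). The sketch below illustrates the general check only; namespaceType is a made-up helper, not KubeVirt's implementation.

package main

import (
	"fmt"
	"os"
	"strings"
)

// namespaceType reads a /proc/<pid>/ns/<name> symlink and returns the
// namespace type encoded in its target, e.g. "mnt" from "mnt:[4026531840]".
// Hypothetical helper for illustration only.
func namespaceType(path string) (string, error) {
	target, err := os.Readlink(path)
	if err != nil {
		return "", fmt.Errorf("failed to open current namespace: %v", err)
	}
	idx := strings.Index(target, ":[")
	if idx < 0 || !strings.HasSuffix(target, "]") {
		return "", fmt.Errorf("error detecting namespace type from path: %s", path)
	}
	return target[:idx], nil
}

func main() {
	t, err := namespaceType("/proc/self/ns/mnt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("namespace type:", t) // prints "mnt" on a normal Linux host
}

Pointing the same check at a task directory that no longer exists reproduces the "failed to open" case seen in these logs.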
level=info timestamp=2018-08-02T10:59:35.228205Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiq8gdh"
level=info timestamp=2018-08-02T11:00:16.188788Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiq8gdh, existing: true\n"
level=info timestamp=2018-08-02T11:00:16.188878Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T11:00:16.188923Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T11:00:16.189029Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T11:00:16.189712Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind= uid=02ee9989-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T11:00:16.189891Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiq8gdh"
Pod name: virt-launcher-testvmiq8gdh-njzbq
Pod phase: Running
level=info timestamp=2018-08-02T10:58:42.218183Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:58:42.218453Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:58:42.219719Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:58:52.226089Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:58:52.283043Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiq8gdh"
level=info timestamp=2018-08-02T10:58:52.286000Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:58:52.286259Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [120.307 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265

    Timed out after 120.000s.
Expected success, but got an error: <*errors.errorString | 0xc4205ec550>: { s: "vmi still isn't running", } vmi still isn't running /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:284 ------------------------------ STEP: Creating a new VMI STEP: Waiting for the VMI's VirtualMachineInstance to start Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:04:54 http: TLS handshake error from 10.244.1.1:51484: EOF level=info timestamp=2018-08-02T11:04:58.628088Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:05:00.704456Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:05:03.625525Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:05:03.626637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:05:04 http: TLS handshake error from 10.244.1.1:51490: EOF level=info timestamp=2018-08-02T11:05:07.239024Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:05:14 http: TLS handshake error from 10.244.1.1:51496: EOF level=info timestamp=2018-08-02T11:05:22.474134Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:05:22.691800Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:05:24 http: TLS handshake error from 10.244.1.1:51502: EOF level=info timestamp=2018-08-02T11:05:28.627042Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:05:30.725566Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:05:34 http: TLS handshake error from 10.244.1.1:51508: EOF level=info timestamp=2018-08-02T11:05:37.291932Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:00:39.434310Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:00:39.434017Z pos=preset.go:142 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:00:39.434447Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:00:39.443102Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:00:39.443190Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:00:39.467053Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:00:39.467124Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:00:39.480360Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizp4vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizp4vz" level=info timestamp=2018-08-02T11:00:39.490428Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:00:39.490495Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:00:39.493126Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizp4vz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizp4vz" level=info timestamp=2018-08-02T11:00:54.019566Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:00:54.019676Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:00:54.040698Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:00:54.040800Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa309c4-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: 
virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:01:38.110322Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:01:38.110403Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T11:01:38.110535Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiq8gdh kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:02:15.748757Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizp4vz, existing: true\n" level=info timestamp=2018-08-02T11:02:15.748840Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:02:15.748863Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:02:15.748938Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:02:15.749515Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:02:15.749606Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizp4vz" level=info timestamp=2018-08-02T11:03:37.669832Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizp4vz, existing: true\n" level=info timestamp=2018-08-02T11:03:37.669924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:03:37.669964Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:03:37.670079Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:03:37.670772Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizp4vz kind= uid=4aa79b97-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
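Each "Synchronizing the VirtualMachineInstance failed." record is followed by a "re-enqueuing VirtualMachineInstance ..." record because the handler puts the object's key back on a rate-limited work queue after a failed sync, so the same VMI is retried with backoff; since the namespace error never clears in this run, the pair keeps repeating until the test times out. The sketch below shows that generic controller pattern with client-go's workqueue package, with syncVMI as a placeholder that always fails; it is illustrative, not the actual virt-handler code.

package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/util/workqueue"
)

// syncVMI stands in for the real per-key reconcile logic; here it always
// fails, mimicking the repeated sync failures in the logs above.
func syncVMI(key string) error {
	return fmt.Errorf("synchronizing %s failed", key)
}

func runWorker(queue workqueue.RateLimitingInterface) {
	for {
		item, shutdown := queue.Get()
		if shutdown {
			return
		}
		key := item.(string)
		if err := syncVMI(key); err != nil {
			// Put the key back with rate limiting so it is retried with
			// growing backoff; this is what produces the repeated
			// "re-enqueuing VirtualMachineInstance ..." records.
			fmt.Printf("re-enqueuing %s after error: %v\n", key, err)
			queue.AddRateLimited(key)
		} else {
			// A clean sync clears the accumulated backoff for this key.
			queue.Forget(key)
		}
		queue.Done(key)
	}
}

func main() {
	queue := workqueue.NewRateLimitingQueue(workqueue.DefaultControllerRateLimiter())
	queue.Add("kubevirt-test-default/testvmizp4vz")
	go runWorker(queue)
	time.Sleep(2 * time.Second) // let a few rate-limited retries happen
	queue.ShutDown()
}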
level=info timestamp=2018-08-02T11:03:37.670885Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizp4vz" Pod name: virt-launcher-testvmizp4vz-44xxm Pod phase: Running level=info timestamp=2018-08-02T11:00:42.938926Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:00:42.939194Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:00:42.941521Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:00:52.951349Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:00:52.973893Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmizp4vz" level=info timestamp=2018-08-02T11:00:52.975598Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:00:52.975796Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [301.358 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should stop VirtualMachineInstance if running set to false [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325 Timed out after 300.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:09:54 http: TLS handshake error from 10.244.1.1:51664: EOF level=info timestamp=2018-08-02T11:09:58.663950Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:10:00.913962Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:10:03.675854Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:10:03.677021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:10:04 http: TLS handshake error from 10.244.1.1:51670: EOF level=info timestamp=2018-08-02T11:10:07.779906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:10:14 http: TLS handshake error from 10.244.1.1:51676: EOF level=info timestamp=2018-08-02T11:10:22.989345Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- 
method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:10:23.208454Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:10:24 http: TLS handshake error from 10.244.1.1:51682: EOF level=info timestamp=2018-08-02T11:10:28.639958Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:10:30.939488Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:10:34 http: TLS handshake error from 10.244.1.1:51688: EOF level=info timestamp=2018-08-02T11:10:37.833271Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:05:40.784431Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:05:40.784532Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:05:40.793399Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:40.796866Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:05:40.797525Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:40.797582Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:05:40.837635Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:40.837704Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:05:40.843487Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi8pr5m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi8pr5m" level=info 
timestamp=2018-08-02T11:05:40.851253Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:40.851305Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:05:56.254187Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:56.254259Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:05:56.269877Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:05:56.270525Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe422778-9643-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:06:37.016763Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:06:37.017492Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:06:37.017595Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8pr5m" level=info timestamp=2018-08-02T11:07:17.984323Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi8pr5m, existing: true\n" level=info timestamp=2018-08-02T11:07:17.984416Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:07:17.984446Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:07:17.984543Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:07:17.985577Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
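The virt-controller messages of the form "Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io ...: the object has been modified; please apply your changes to the latest version and try again" are ordinary optimistic-concurrency conflicts: an update was sent with a stale resourceVersion, and the controller simply re-queues and retries, so they are noise in this run rather than the cause of the failures. Client code usually handles the same conflict with client-go's retry helper, roughly as in the sketch below; VMI, VMIInterface and updateVMILabels are stand-ins for illustration, not KubeVirt's client API.

package example

import (
	"context"

	"k8s.io/client-go/util/retry"
)

// VMI and VMIInterface are stand-ins so the sketch compiles on its own;
// a real caller would use the generated KubeVirt client instead.
type VMI struct {
	Labels map[string]string
}

type VMIInterface interface {
	Get(ctx context.Context, name string) (*VMI, error)
	Update(ctx context.Context, vmi *VMI) (*VMI, error)
}

// updateVMILabels shows the standard answer to "the object has been
// modified; please apply your changes to the latest version and try again":
// re-read the object, reapply the change, and retry the update whenever the
// server reports a resourceVersion conflict.
func updateVMILabels(ctx context.Context, client VMIInterface, name string) error {
	return retry.RetryOnConflict(retry.DefaultRetry, func() error {
		vmi, err := client.Get(ctx, name)
		if err != nil {
			return err
		}
		if vmi.Labels == nil {
			vmi.Labels = map[string]string{}
		}
		vmi.Labels["example/touched"] = "true"
		_, err = client.Update(ctx, vmi)
		return err // a Conflict error here triggers another attempt
	})
}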
level=info timestamp=2018-08-02T11:07:17.985887Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8pr5m" level=info timestamp=2018-08-02T11:08:39.906504Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi8pr5m, existing: true\n" level=info timestamp=2018-08-02T11:08:39.906602Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:08:39.906632Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:08:39.906746Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:08:39.907339Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8pr5m kind= uid=fe455445-9643-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:08:39.907433Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8pr5m" Pod name: virt-launcher-testvmi8pr5m-gks9s Pod phase: Running level=info timestamp=2018-08-02T11:05:44.042932Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:05:44.043224Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:05:44.045308Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:05:54.052075Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:05:54.087836Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi8pr5m" level=info timestamp=2018-08-02T11:05:54.089266Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:05:54.089437Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [301.346 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should start and stop VirtualMachineInstance multiple times [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333 Timed out after 300.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Doing run: 0 STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:15:53.555043Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:15:53.779204Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:15:54 http: TLS handshake error from 10.244.1.1:51880: EOF level=info timestamp=2018-08-02T11:15:58.673876Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:16:01.181481Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:16:04 http: TLS handshake error from 10.244.1.1:51886: EOF level=info timestamp=2018-08-02T11:16:08.419121Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:16:14 http: TLS handshake error from 10.244.1.1:51892: EOF level=info timestamp=2018-08-02T11:16:23.605389Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:16:23.831370Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:16:24 http: TLS handshake error from 10.244.1.1:51898: EOF level=info timestamp=2018-08-02T11:16:28.672354Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:16:31.201044Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:16:34 http: TLS handshake error from 10.244.1.1:51904: EOF level=info timestamp=2018-08-02T11:16:38.472363Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:16:27.893447Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:27.893498Z pos=vm.go:377 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Setting stabile UUID 'f0da08ae-ca08-5b61-aef7-ef7b500f11f4' (was '')" level=info timestamp=2018-08-02T11:16:27.901334Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=7ffc32e7-9645-11e8-9cd7-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=info timestamp=2018-08-02T11:16:27.901382Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=7ffc32e7-9645-11e8-9cd7-525500d15501 msg="VirtualMachineInstance created bacause testvmi7rd59 was added." level=info timestamp=2018-08-02T11:16:27.901435Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=7ffc32e7-9645-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:16:27.901507Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=7ffc32e7-9645-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:16:27.912652Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:27.912735Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:27.919545Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:27.919625Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:27.934525Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:27.934574Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:27.941263Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7rd59\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7rd59" level=info timestamp=2018-08-02T11:16:27.948483Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:27.948564Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e01ddb-9644-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z 
pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:16:24.177642Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e1a527-9644-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:16:24.177715Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7rd59, existing: true\n" level=info timestamp=2018-08-02T11:16:24.177739Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:16:24.177772Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:16:24.177812Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e1a527-9644-11e8-9cd7-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T11:16:24.177854Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e1a527-9644-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:16:24.183814Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7rd59, existing: true\n" level=info timestamp=2018-08-02T11:16:24.183892Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:16:24.183924Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:16:24.184012Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e1a527-9644-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:16:24.184203Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind= uid=b1e1a527-9644-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:16:27.682529Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7rd59, existing: false\n" level=info timestamp=2018-08-02T11:16:27.682604Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:16:27.682726Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:16:27.682818Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7rd59 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
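The virt-launcher pods in this run each log "Watchdog file created at /var/run/kubevirt/watchdog-files/<namespace>_<name>" and then "Marked as ready"; the watchdog file is a heartbeat that a monitoring process can check by looking at how recently it was touched. The sketch below shows that general heartbeat-file pattern only, using a demo path under /tmp; it is an assumption-level illustration, not KubeVirt's watchdog code.

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// touchLoop keeps updating the modification time of a watchdog file so a
// separate monitor can tell the writer is still alive.
func touchLoop(path string, interval time.Duration, stop <-chan struct{}) error {
	if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
		return err
	}
	f, err := os.OpenFile(path, os.O_CREATE|os.O_WRONLY, 0o644)
	if err != nil {
		return err
	}
	f.Close()
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-stop:
			return nil
		case now := <-ticker.C:
			if err := os.Chtimes(path, now, now); err != nil {
				return err
			}
		}
	}
}

// stale reports whether the watchdog file has not been touched within the
// given timeout, i.e. the writer is presumed gone.
func stale(path string, timeout time.Duration) (bool, error) {
	info, err := os.Stat(path)
	if err != nil {
		return true, err
	}
	return time.Since(info.ModTime()) > timeout, nil
}

func main() {
	stop := make(chan struct{})
	path := "/tmp/watchdog-files/demo_testvmi" // demo path, not the real layout
	go touchLoop(path, time.Second, stop)
	time.Sleep(3 * time.Second)
	ok, _ := stale(path, 2*time.Second)
	fmt.Println("stale:", ok) // false while the touch loop is running
	close(stop)
}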
Pod name: virt-launcher-testvmi7rd59-phwmv Pod phase: Running level=info timestamp=2018-08-02T11:16:30.676887Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:16:30.677152Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:16:30.679493Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:16:40.684989Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:16:40.728489Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7rd59" level=info timestamp=2018-08-02T11:16:40.730772Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:16:40.730949Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [360.334 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should not update the VirtualMachineInstance spec if Running [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346 Timed out after 360.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:353 ------------------------------ Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:20:54.073272Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:20:54.299749Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:20:54 http: TLS handshake error from 10.244.1.1:52060: EOF level=info timestamp=2018-08-02T11:20:58.664066Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:21:01.390247Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:21:04 http: TLS handshake error from 10.244.1.1:52066: EOF level=info timestamp=2018-08-02T11:21:08.970249Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:21:14 http: TLS handshake error from 10.244.1.1:52072: EOF level=info timestamp=2018-08-02T11:21:24.128346Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:21:24.355979Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:21:24 http: TLS handshake error from 10.244.1.1:52078: EOF level=info timestamp=2018-08-02T11:21:28.653584Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:21:31.411445Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:21:34 http: TLS handshake error from 10.244.1.1:52084: EOF level=info timestamp=2018-08-02T11:21:39.024061Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:16:42.463251Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:42.460014Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:16:42.463670Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:16:42.471907Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:42.472112Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:42.472191Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:42.472253Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:42.502576Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:42.503084Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:42.517641Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:42.517892Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= 
uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:57.375747Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:57.375859Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:16:57.396414Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:16:57.396492Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a71929-9645-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:17:38.152166Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:17:38.153575Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:17:38.153719Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi582qs" level=info timestamp=2018-08-02T11:18:19.113976Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi582qs, existing: true\n" level=info timestamp=2018-08-02T11:18:19.114070Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:18:19.114093Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:18:19.114176Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:18:19.115210Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
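Editorial note: every "Processing vmi update" in the virt-handler-znjs4 entries above fails with "Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/.../ns/mnt", so the VMI is re-enqueued over and over and never leaves the Scheduled phase. As a rough illustration of the mechanism involved (a minimal sketch of how a namespace type is normally read from procfs on Linux, not the code behind vm.go:426):

package main

// Illustrative sketch only: on Linux the kind of a namespace can be recovered from a
// /proc/<pid>/ns/<name> symlink, whose target has the form "<type>:[<inode>]",
// e.g. "mnt:[4026531840]". If the referenced task has already exited or the link is
// unreadable, Readlink fails and all the caller can report is an
// "Error detecting namespace type from path: ..." style message like the ones above.

import (
	"fmt"
	"os"
	"strings"
)

func namespaceType(path string) (string, error) {
	target, err := os.Readlink(path) // e.g. "mnt:[4026531840]"
	if err != nil {
		return "", fmt.Errorf("error detecting namespace type from path %s: %v", path, err)
	}
	sep := strings.IndexByte(target, ':')
	if sep < 0 {
		return "", fmt.Errorf("unexpected namespace link target %q", target)
	}
	return target[:sep], nil
}

func main() {
	kind, err := namespaceType("/proc/self/ns/mnt")
	fmt.Println(kind, err)
}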
level=info timestamp=2018-08-02T11:18:19.115302Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi582qs" level=info timestamp=2018-08-02T11:19:41.035955Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi582qs, existing: true\n" level=info timestamp=2018-08-02T11:19:41.036032Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:19:41.036056Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:19:41.036131Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:19:41.036766Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi582qs kind= uid=88a96103-9645-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:19:41.036940Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi582qs" Pod name: virt-launcher-testvmi582qs-jz84g Pod phase: Running level=info timestamp=2018-08-02T11:16:45.864148Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:16:45.864436Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:16:45.866949Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:16:55.874263Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:16:55.914734Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi582qs" level=info timestamp=2018-08-02T11:16:55.916981Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:16:55.917208Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [300.352 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should survive guest shutdown, multiple times [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387 Timed out after 300.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Creating new VMI, not running STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition VM testvmiqjk2j was scheduled to start Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:26:54.921808Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:26:58.651520Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:27:01.651859Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:27:03.789098Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:27:03.790390Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:27:04 http: TLS handshake error from 10.244.1.1:52282: EOF level=info timestamp=2018-08-02T11:27:09.615265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:27:14 http: TLS handshake error from 10.244.1.1:52288: EOF level=info timestamp=2018-08-02T11:27:24.754741Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:27:24 http: TLS handshake error from 10.244.1.1:52294: EOF level=info timestamp=2018-08-02T11:27:24.974599Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:27:28.658296Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:27:31.671990Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:27:34 http: TLS handshake error from 10.244.1.1:52300: EOF level=info timestamp=2018-08-02T11:27:39.668406Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:27:33.642636Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=0ccb9ab4-9647-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info 
timestamp=2018-08-02T11:27:33.646113Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=0ccb9ab4-9647-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:27:33.647860Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.647914Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:27:33.649392Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.649437Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:27:33.662397Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.664819Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:27:33.664926Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.665062Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:27:33.688851Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.688925Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:27:33.695270Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqjk2j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjk2j" level=info timestamp=2018-08-02T11:27:33.704694Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:27:33.704764Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bada856-9646-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info 
timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:27:26.344122Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bb08dde-9646-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." 
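Editorial note: the component entries throughout these dumps share a flat key=value layout (level, timestamp, pos, component, msg, reason, ...), with double-quoted values for free-text fields. A small helper along these lines (an illustrative sketch, not part of the KubeVirt test suite; it does not handle escaped quotes inside values) makes it easier to slice a dump by component, pos, or msg when digging through failures like the ones in this run:

package main

import (
	"fmt"
	"strings"
)

// parseRecord splits one key=value log record into a map. Quoted values are read up
// to the next double quote; escape sequences are not handled in this sketch.
func parseRecord(line string) map[string]string {
	fields := map[string]string{}
	rest := strings.TrimSpace(line)
	for len(rest) > 0 {
		eq := strings.IndexByte(rest, '=')
		if eq < 0 {
			break
		}
		key := rest[:eq]
		rest = rest[eq+1:]
		var value string
		if strings.HasPrefix(rest, `"`) {
			end := strings.Index(rest[1:], `"`)
			if end < 0 {
				value, rest = rest[1:], ""
			} else {
				value, rest = rest[1:1+end], rest[end+2:]
			}
		} else {
			sp := strings.IndexByte(rest, ' ')
			if sp < 0 {
				value, rest = rest, ""
			} else {
				value, rest = rest[:sp], rest[sp+1:]
			}
		}
		fields[key] = value
		rest = strings.TrimSpace(rest)
	}
	return fields
}

func main() {
	rec := parseRecord(`level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n"`)
	fmt.Println(rec["component"], rec["pos"], rec["msg"])
}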
level=info timestamp=2018-08-02T11:27:26.344274Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjk2j, existing: true\n" level=info timestamp=2018-08-02T11:27:26.344336Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:27:26.344365Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:27:26.344448Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bb08dde-9646-11e8-9cd7-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T11:27:26.344530Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bb08dde-9646-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:27:26.351507Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjk2j, existing: true\n" level=info timestamp=2018-08-02T11:27:26.351541Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:27:26.351570Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:27:26.351609Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bb08dde-9646-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:27:26.351781Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind= uid=3bb08dde-9646-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:27:33.416655Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjk2j, existing: false\n" level=info timestamp=2018-08-02T11:27:33.416763Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:27:33.416837Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:27:33.416923Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqjk2j kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqjk2j-zp9r2 Pod phase: Running level=info timestamp=2018-08-02T11:27:36.461881Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:27:36.462190Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:27:36.464635Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" • Failure [360.349 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 Using virtctl interface /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435 should start a VirtualMachineInstance once [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436 Timed out after 360.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:453 ------------------------------ STEP: getting an VMI STEP: Invoking virtctl start STEP: Getting the status of the VMI Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:32:54 http: TLS handshake error from 10.244.1.1:52492: EOF level=info timestamp=2018-08-02T11:32:55.330132Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:32:55.547940Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:32:58.654221Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:33:01.923126Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:33:04 http: TLS handshake error from 10.244.1.1:52498: EOF level=info timestamp=2018-08-02T11:33:10.257631Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:33:14 http: TLS handshake error from 10.244.1.1:52504: EOF 2018/08/02 11:33:24 http: TLS handshake error from 10.244.1.1:52510: EOF level=info timestamp=2018-08-02T11:33:25.380838Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:33:25.592600Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:33:28.662227Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:33:31.946116Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:33:34 http: TLS handshake error from 10.244.1.1:52516: EOF level=info timestamp=2018-08-02T11:33:40.311069Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:33:27.504556Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:33:27.504649Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= 
uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:33:27.504717Z pos=vm.go:377 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Setting stabile UUID 'b2539fef-48a4-5f7b-bac3-aa026f5d6ce3' (was '')" level=info timestamp=2018-08-02T11:33:27.523215Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=info timestamp=2018-08-02T11:33:27.523354Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="VirtualMachineInstance created bacause testvmi56g56 was added." level=info timestamp=2018-08-02T11:33:27.523477Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:33:27.523567Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:33:27.550277Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:33:27.550360Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:33:27.565657Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:33:27.565737Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:33:27.600309Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:33:27.600386Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T11:33:27.622431Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T11:33:27.622503Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi56g56 kind= uid=1276d567-9647-11e8-9cd7-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running 
level=info timestamp=2018-08-02T10:42:08.969093Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981181Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.981493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: true\n" level=info timestamp=2018-08-02T10:42:08.981521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T10:42:08.981557Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.981611Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.981690Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind= uid=4968efff-9640-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:42:08.985867Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:42:08.985912Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:42:08.986277Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:42:08.986339Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:44:51.347688Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqzlkp, existing: false\n" level=info timestamp=2018-08-02T10:44:51.347769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:44:51.347840Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:44:51.347928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqzlkp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:33:43.004890Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:33:43.007382Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:33:43.026496Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi56g56" level=info timestamp=2018-08-02T11:33:43.026603Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi56g56, existing: true\n" level=info timestamp=2018-08-02T11:33:43.026629Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:33:43.026656Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:33:43.026785Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:33:43.027349Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:33:43.027436Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi56g56" level=info timestamp=2018-08-02T11:33:43.031765Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi56g56, existing: true\n" level=info timestamp=2018-08-02T11:33:43.031813Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:33:43.031849Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:33:43.031930Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:33:43.032286Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi56g56 kind= uid=dfb8947d-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:33:43.032373Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi56g56" Pod name: virt-launcher-testvmi56g56-vckdq Pod phase: Running level=info timestamp=2018-08-02T11:33:31.149749Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:33:31.150097Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:33:31.151696Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:33:41.156953Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:33:41.188715Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi56g56" level=info timestamp=2018-08-02T11:33:41.190446Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:33:41.190630Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [360.340 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 Using virtctl interface /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435 should stop a VirtualMachineInstance once [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467 Timed out after 360.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:480 ------------------------------ STEP: getting an VMI STEP: Invoking virtctl stop STEP: Ensuring VMI is running • ------------------------------ • [SLOW TEST:8.205 seconds] VirtualMachineInstanceReplicaSet /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46 should scale /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 to five, to six and then to zero replicas /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ •• Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:35:04.631941Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:35:04 http: TLS handshake error from 10.244.1.1:52570: EOF level=info timestamp=2018-08-02T11:35:10.475295Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:35:14 http: TLS handshake error from 10.244.1.1:52576: EOF 2018/08/02 11:35:24 http: TLS handshake error from 10.244.1.1:52582: EOF level=info timestamp=2018-08-02T11:35:25.596089Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:35:25.809025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:35:28.687472Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:35:32.031468Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:35:34 http: TLS handshake error from 10.244.1.1:52588: EOF level=info timestamp=2018-08-02T11:35:40.529143Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:35:44 http: TLS handshake error from 10.244.1.1:52594: EOF 2018/08/02 11:35:54 http: TLS handshake error from 10.244.1.1:52600: EOF level=info timestamp=2018-08-02T11:35:55.642791Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:35:55.860382Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:33:49.983770Z pos=preset.go:171 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmirp7sxnzchp kind= uid=ed1d4d5c-9647-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-02T11:33:51.810554Z pos=replicaset.go:230 component=virt-controller service=http namespace=kubevirt-test-default name=replicasetdw754 kind= uid=eb7e9bed-9647-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetdw754\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the replicaset status failed." level=info timestamp=2018-08-02T11:33:51.811937Z pos=replicaset.go:137 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetdw754\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstanceReplicaSet kubevirt-test-default/replicasetdw754" level=info timestamp=2018-08-02T11:33:56.329851Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T11:33:56.331524Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmicxp6ljlswm" level=info timestamp=2018-08-02T11:33:56.331022Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:33:56.331601Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:33:56.334200Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T11:33:56.334235Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmicxp6ld6772" level=info timestamp=2018-08-02T11:33:56.334274Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:33:56.334311Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-02T11:33:56.364808Z pos=replicaset.go:230 component=virt-controller service=http namespace=kubevirt-test-default name=replicasetczxds kind= uid=f0e06842-9647-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetczxds\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the replicaset status failed." 
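Editorial note: the "Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io ...: the object has been modified; please apply your changes to the latest version and try again" entries above are ordinary optimistic-concurrency conflicts: the status update raced with another writer and was rejected because its resourceVersion was stale. The controller simply re-enqueues the object, as the entries that follow show. For comparison, the stock client-go pattern for this class of error is a conflict-aware retry; a minimal sketch, with updateStatus as a hypothetical stand-in rather than KubeVirt's code:

package main

// Minimal sketch of client-go's conflict retry helper for "the object has been
// modified" errors. A real caller would re-GET the object and re-apply its change
// inside the closure so each attempt works against the latest resourceVersion.

import (
	"fmt"

	"k8s.io/client-go/util/retry"
)

// updateStatus is hypothetical: fetch the latest object, mutate it, call Update(),
// and return the API error (which RetryOnConflict inspects for conflict status).
func updateStatus() error {
	return nil
}

func main() {
	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
		return updateStatus()
	})
	fmt.Println("update finished:", err)
}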
level=info timestamp=2018-08-02T11:33:56.380354Z pos=replicaset.go:137 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstancereplicasets.kubevirt.io \"replicasetczxds\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachineInstanceReplicaSet kubevirt-test-default/replicasetczxds" level=info timestamp=2018-08-02T11:33:56.795512Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicxp6ljlswm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ljlswm" level=info timestamp=2018-08-02T11:33:57.400291Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicxp6ljlswm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ljlswm" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:34:32.168920Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:34:32.170208Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:34:32.170375Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ld6772" level=info timestamp=2018-08-02T11:34:52.650659Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ld6772, existing: true\n" level=info timestamp=2018-08-02T11:34:52.650739Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:34:52.650769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:34:52.650872Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:34:52.651706Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:34:52.651831Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ld6772" level=info timestamp=2018-08-02T11:35:33.613468Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ld6772, existing: true\n" level=info timestamp=2018-08-02T11:35:33.613542Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:35:33.613565Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:35:33.613647Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:35:33.615368Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind= uid=f0e4c34c-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:35:33.615451Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ld6772" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:34:31.730981Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:34:31.732881Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:34:31.732991Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ljlswm" level=info timestamp=2018-08-02T11:34:52.213225Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ljlswm, existing: true\n" level=info timestamp=2018-08-02T11:34:52.213341Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:34:52.213435Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:34:52.213583Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:34:52.214611Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
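Editorial note: the retries for testvmicxp6ld6772 and testvmicxp6ljlswm in the handler dumps above land at roughly 11:34:31, 11:34:52 and 11:35:33, with the same namespace error each time. "re-enqueuing" with growing gaps is consistent with a rate-limited workqueue: a failed key goes back on the queue with an exponentially increasing delay instead of being retried in a hot loop. A schematic example using client-go's workqueue package (illustrative only, not the virt-handler source):

package main

import (
	"fmt"

	"k8s.io/client-go/util/workqueue"
)

// syncVMI is a hypothetical stand-in for one synchronization attempt that keeps
// failing, as the handler does above while the namespace error persists.
func syncVMI(key string) error {
	return fmt.Errorf("failed to open current namespace")
}

func main() {
	queue := workqueue.NewRateLimitingQueue(workqueue.DefaultControllerRateLimiter())
	queue.Add("kubevirt-test-default/testvmicxp6ld6772")

	for i := 0; i < 3; i++ {
		key, quit := queue.Get()
		if quit {
			return
		}
		if err := syncVMI(key.(string)); err != nil {
			// Put the key back with an exponentially growing delay instead of retrying immediately.
			fmt.Printf("re-enqueuing %v (requeues so far: %d): %v\n", key, queue.NumRequeues(key), err)
			queue.AddRateLimited(key)
		} else {
			// A successful sync resets the per-key backoff.
			queue.Forget(key)
		}
		queue.Done(key)
	}
}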
level=info timestamp=2018-08-02T11:34:52.214839Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ljlswm" level=info timestamp=2018-08-02T11:35:33.175086Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ljlswm, existing: true\n" level=info timestamp=2018-08-02T11:35:33.175181Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:35:33.175213Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:35:33.175301Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:35:33.175961Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:35:33.176776Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicxp6ljlswm" Pod name: virt-launcher-testvmicxp6ld6772-z9j77 Pod phase: Running level=info timestamp=2018-08-02T11:34:00.429489Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:34:00.429767Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:34:00.433979Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:34:10.441883Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:34:10.497457Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicxp6ld6772" level=info timestamp=2018-08-02T11:34:10.499689Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:34:10.499880Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmicxp6ljlswm-xcs7q Pod phase: Running level=info timestamp=2018-08-02T11:33:59.818003Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:33:59.818368Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:33:59.820643Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:34:09.826830Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:34:09.871593Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicxp6ljlswm" level=info 
timestamp=2018-08-02T11:34:09.873132Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:34:09.873309Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" ------------------------------ • Failure [121.368 seconds] VirtualMachineInstanceReplicaSet /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46 should update readyReplicas once VMIs are up [It] /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:157 Timed out after 120.000s. Expected : 0 to equal : 2 /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:166 ------------------------------ STEP: Create a new VirtualMachineInstance replica set STEP: Scaling to 2 STEP: Checking the number of replicas STEP: Checking the number of ready replicas •• ------------------------------ • [SLOW TEST:5.459 seconds] VirtualMachineInstanceReplicaSet /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46 should not scale when paused and scale when resume /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:223 ------------------------------ ••••••••••••• Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=error timestamp=2018-08-02T11:36:43.441231Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:43.441306Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:36:44.487190Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:44.487278Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 2018/08/02 11:36:44 http: TLS handshake error from 10.244.1.1:52642: EOF level=error timestamp=2018-08-02T11:36:45.541729Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:45.541813Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:36:46.607654Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
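Editorial note: the failures in this run all have the same shape: a Gomega polling assertion that never saw the expected state before its deadline ("Timed out after 120.000s. Expected 0 to equal 2" above, and the earlier "Timed out after 300.000s/360.000s. Expected false to be true"). That is consistent with the VMIs being stuck in Scheduled behind the virt-handler namespace errors rather than with the assertions themselves being wrong. The pattern looks roughly like the following sketch, shaped as a Go test with a hypothetical getReadyReplicas helper; it is not the actual code at replicaset_test.go:166:

package example

// Sketch of the Gomega polling assertion behind "Timed out after 120.000s.
// Expected 0 to equal 2": Eventually re-evaluates the function until the matcher
// passes or the timeout elapses, then fails with the last observed value.

import (
	"testing"
	"time"

	. "github.com/onsi/gomega"
)

// getReadyReplicas is a hypothetical stand-in for reading the replica set status.
// Returning 0 here instead would reproduce the timed-out failure seen above.
func getReadyReplicas() int32 {
	return 2
}

func TestReadyReplicasEventually(t *testing.T) {
	g := NewGomegaWithT(t)
	g.Eventually(getReadyReplicas, 120*time.Second, 1*time.Second).Should(Equal(int32(2)))
}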
level=info timestamp=2018-08-02T11:36:46.607781Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:36:47.656902Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:47.656992Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:36:48.706760Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:48.706860Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:36:49.761729Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:36:49.761827Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmibmkfg/console proto=HTTP/1.1 statusCode=400 contentLength=89 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:36:17.782707Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisdzf5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisdzf5" level=info timestamp=2018-08-02T11:36:18.182247Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisdzf5\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisdzf5, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 4501b677-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisdzf5" level=info timestamp=2018-08-02T11:36:18.632207Z pos=preset.go:167 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipcd8m kind= uid=45b70b1c-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstance is excluded from VirtualMachinePresets" level=info timestamp=2018-08-02T11:36:18.632296Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipcd8m kind= uid=45b70b1c-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:18.796458Z pos=vmi.go:157 
component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipcd8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipcd8m" level=info timestamp=2018-08-02T11:36:18.982283Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipcd8m\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmipcd8m, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 45b70b1c-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipcd8m" level=info timestamp=2018-08-02T11:36:19.831729Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831865Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-conflict-jk9lv matches VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831909Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-memory-f67bm matches VirtualMachineInstance" level=error timestamp=2018-08-02T11:36:19.832239Z pos=preset.go:415 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachinePresets cannot be applied due to conflicts: presets 'test-memory-f67bm' and 'test-conflict-jk9lv' conflict: spec.resources.requests[memory]: {{128 6} {} 128M DecimalSI} != {{256 6} {} 256M DecimalSI}" level=warning timestamp=2018-08-02T11:36:19.832278Z pos=preset.go:157 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as failed" level=info timestamp=2018-08-02T11:36:19.832304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.109468Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:20.109679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.182409Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisw2ft\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisw2ft, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 466e20e5-9648-11e8-9cd7-525500d15501, UID in 
object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisw2ft" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:36:38.070508Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:36:38.071606Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:36:38.071709Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibmkfg" level=info timestamp=2018-08-02T11:36:40.632015Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibmkfg, existing: true\n" level=info timestamp=2018-08-02T11:36:40.632094Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:36:40.632118Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:40.632204Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:36:40.632980Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:36:40.633061Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibmkfg" level=info timestamp=2018-08-02T11:36:45.753346Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibmkfg, existing: true\n" level=info timestamp=2018-08-02T11:36:45.753454Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:36:45.753493Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:45.753607Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:36:45.754695Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:36:45.754787Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibmkfg" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:35:57.476963Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:35:57.477064Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ljlswm, existing: true\n" level=info timestamp=2018-08-02T11:35:57.477088Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:35:57.477126Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:35:57.477187Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:35:57.477285Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:35:57.507468Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ljlswm, existing: true\n" level=info timestamp=2018-08-02T11:35:57.507506Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:35:57.507534Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:35:57.507575Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:35:57.507650Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind= uid=f0e46b4b-9647-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:35:57.538956Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ljlswm, existing: false\n" level=info timestamp=2018-08-02T11:35:57.538991Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:35:57.539032Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:35:57.539107Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ljlswm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmibmkfg-srt55 Pod phase: Running level=info timestamp=2018-08-02T11:36:24.329356Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:36:24.329671Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:36:24.332225Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:36:34.339070Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:36:34.379284Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibmkfg" level=info timestamp=2018-08-02T11:36:34.381154Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:36:34.381372Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" ------------------------------ • Failure [30.184 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66 with a cirros image /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:67 should return that we are running cirros [It] /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:68 Expected error: <*errors.errorString | 0xc420608050>: { s: "Timeout trying to connect to the virtual machine instance", } Timeout trying to connect to the virtual machine instance not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:55 ------------------------------ STEP: Creating a new VirtualMachineInstance STEP: Expecting the VirtualMachineInstance console Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=error timestamp=2018-08-02T11:37:13.760956Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:13.761043Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 2018/08/02 11:37:14 http: TLS handshake error from 10.244.1.1:52660: EOF level=error timestamp=2018-08-02T11:37:14.808332Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
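
Note on the cirros console failure above: the test reports "Timeout trying to connect to the virtual machine instance", and the virt-api log shows why — every console subresource request is rejected with HTTP 400 because the VMI is still Scheduled instead of Running. A console client in this situation simply keeps retrying until the phase changes or a deadline expires; a minimal sketch of that loop follows, where openConsole is a hypothetical placeholder for the subresource call, not the kubevirt client API.

package main

import (
    "errors"
    "fmt"
    "net/http"
    "time"
)

// openConsole is a hypothetical stand-in for the console subresource request.
// It returns http.StatusBadRequest while the VMI is still Scheduled, matching
// the virt-api responses in the log above.
func openConsole(vmi string) (int, error) {
    return http.StatusBadRequest, errors.New("phase is Scheduled instead of Running")
}

// connectWithRetry keeps retrying the console request until it succeeds or the
// deadline passes, which is what the failing test expects to complete well
// inside its 30 s limit.
func connectWithRetry(vmi string, timeout, interval time.Duration) error {
    deadline := time.Now().Add(timeout)
    for time.Now().Before(deadline) {
        status, err := openConsole(vmi)
        if status == http.StatusOK && err == nil {
            return nil
        }
        time.Sleep(interval)
    }
    return fmt.Errorf("timeout trying to connect to the virtual machine instance %s", vmi)
}

func main() {
    // With a short demo timeout this prints the same kind of timeout error
    // that the console tests report.
    fmt.Println(connectWithRetry("testvmibmkfg", 3*time.Second, time.Second))
}
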
level=info timestamp=2018-08-02T11:37:14.808407Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:15.873147Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:15.873213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:16.921339Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:16.921415Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:17.965606Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:17.965710Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:19.025546Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:19.025657Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:20.076415Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
level=info timestamp=2018-08-02T11:37:20.076498Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmig2mkg/console proto=HTTP/1.1 statusCode=400 contentLength=89 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:36:18.632207Z pos=preset.go:167 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipcd8m kind= uid=45b70b1c-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstance is excluded from VirtualMachinePresets" level=info timestamp=2018-08-02T11:36:18.632296Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipcd8m kind= uid=45b70b1c-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:18.796458Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipcd8m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipcd8m" level=info timestamp=2018-08-02T11:36:18.982283Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipcd8m\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmipcd8m, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 45b70b1c-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipcd8m" level=info timestamp=2018-08-02T11:36:19.831729Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831865Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-conflict-jk9lv matches VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831909Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-memory-f67bm matches VirtualMachineInstance" level=error timestamp=2018-08-02T11:36:19.832239Z pos=preset.go:415 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachinePresets cannot be applied due to conflicts: presets 'test-memory-f67bm' and 'test-conflict-jk9lv' conflict: spec.resources.requests[memory]: {{128 6} {} 128M DecimalSI} != {{256 6} {} 256M DecimalSI}" level=warning timestamp=2018-08-02T11:36:19.832278Z pos=preset.go:157 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as failed" level=info timestamp=2018-08-02T11:36:19.832304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" 
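
Note on the preset error above: this is the intended outcome of the conflict test. Presets test-memory-f67bm and test-conflict-jk9lv both match the VMI but request different memory (128M vs 256M), so the controller marks the VMI as failed rather than guessing. The "{{128 6} {} 128M DecimalSI}" dump is just the internal representation of a Kubernetes resource.Quantity; the conflict itself reduces to a quantity comparison, sketched below with the standard apimachinery resource package.

package main

import (
    "fmt"

    "k8s.io/apimachinery/pkg/api/resource"
)

func main() {
    // The two memory requests from the conflicting presets in the log above.
    a := resource.MustParse("128M")
    b := resource.MustParse("256M")

    // Cmp returns 0 only when the quantities are equal; any other result is a
    // conflict that cannot be resolved by merging the presets.
    if a.Cmp(b) != 0 {
        fmt.Printf("presets conflict: spec.resources.requests[memory]: %s != %s\n",
            a.String(), b.String())
    }
}
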
level=info timestamp=2018-08-02T11:36:20.109468Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:20.109679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.182409Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisw2ft\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisw2ft, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 466e20e5-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisw2ft" level=info timestamp=2018-08-02T11:36:50.428433Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:50.429729Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:36:50.255254Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:50.255330Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:36:50.255429Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:36:50.261868Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibmkfg, existing: false\n" level=info timestamp=2018-08-02T11:36:50.270463Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:50.270627Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:36:50.270714Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:36:55.535733Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicxp6ld6772, existing: false\n" level=info timestamp=2018-08-02T11:36:55.535824Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:55.536115Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:36:55.536208Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicxp6ld6772 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:36:55.995096Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibmkfg, existing: false\n" level=info timestamp=2018-08-02T11:36:55.995176Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:36:55.995273Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:36:55.995351Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibmkfg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:37:09.846501Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:09.847159Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:37:09.847267Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmig2mkg" level=info timestamp=2018-08-02T11:37:12.407481Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmig2mkg, existing: true\n" level=info timestamp=2018-08-02T11:37:12.407575Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:12.407619Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:12.407765Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:12.408457Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:37:12.408576Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmig2mkg" level=info timestamp=2018-08-02T11:37:17.528796Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmig2mkg, existing: true\n" level=info timestamp=2018-08-02T11:37:17.528890Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:17.528914Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:17.529029Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:17.529786Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:37:17.529877Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmig2mkg" Pod name: virt-launcher-testvmig2mkg-kbptw Pod phase: Running level=info timestamp=2018-08-02T11:36:55.703797Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:36:55.704070Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:36:55.706270Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:37:05.712317Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:37:05.737498Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmig2mkg" level=info timestamp=2018-08-02T11:37:05.739108Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:37:05.739318Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [30.318 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66 with a fedora image /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:77 should return that we are running fedora [It] /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:78 Expected error: <*errors.errorString | 0xc4208a0790>: { s: "Timeout trying to connect to the virtual machine instance", } Timeout trying to connect to the virtual machine instance not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:55 ------------------------------ STEP: Creating a new VirtualMachineInstance STEP: Expecting the VirtualMachineInstance console Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: 
disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=error timestamp=2018-08-02T11:37:43.766153Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:43.766239Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 2018/08/02 11:37:44 http: TLS handshake error from 10.244.1.1:52678: EOF level=error timestamp=2018-08-02T11:37:44.826096Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:44.826179Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:45.877290Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:45.877360Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:46.921584Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:46.921667Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:47.966989Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:47.967077Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:49.014869Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
level=info timestamp=2018-08-02T11:37:49.014956Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:37:50.065665Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:37:50.065813Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmi4l72t/console proto=HTTP/1.1 statusCode=400 contentLength=89 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:36:19.831729Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831865Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-conflict-jk9lv matches VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:19.831909Z pos=preset.go:255 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachineInstancePreset test-memory-f67bm matches VirtualMachineInstance" level=error timestamp=2018-08-02T11:36:19.832239Z pos=preset.go:415 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachinePresets cannot be applied due to conflicts: presets 'test-memory-f67bm' and 'test-conflict-jk9lv' conflict: spec.resources.requests[memory]: {{128 6} {} 128M DecimalSI} != {{256 6} {} 256M DecimalSI}" level=warning timestamp=2018-08-02T11:36:19.832278Z pos=preset.go:157 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as failed" level=info timestamp=2018-08-02T11:36:19.832304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.109468Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:20.109679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.182409Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisw2ft\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisw2ft, ResourceVersion: 0, AdditionalErrorMsg: Precondition 
failed: UID in precondition: 466e20e5-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisw2ft" level=info timestamp=2018-08-02T11:36:50.428433Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:50.429729Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:20.749093Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:20.749227Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:20.817209Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:20.849285Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:37.628290Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:37.631257Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:37:37.631372Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:40.191574Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:40.191707Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:40.191731Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:40.191836Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:40.193128Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/13153/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:37:40.193246Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/13153/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:45.313443Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:45.313539Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:45.313581Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:45.313685Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:37:45.314731Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:37:20.335825Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:20.347255Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:37:20.347330Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmig2mkg, existing: true\n" level=info timestamp=2018-08-02T11:37:20.347353Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T11:37:20.347379Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:20.347419Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:20.347503Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:20.354724Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmig2mkg, existing: false\n" level=info timestamp=2018-08-02T11:37:20.354763Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:20.354803Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:20.354877Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:27.770076Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmig2mkg, existing: false\n" level=info timestamp=2018-08-02T11:37:27.770167Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:27.770251Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:27.770379Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmig2mkg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmi4l72t-6gdgm Pod phase: Running level=info timestamp=2018-08-02T11:37:24.255476Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:37:24.255811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:37:24.257445Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:37:34.264669Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:37:34.316300Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4l72t" level=info timestamp=2018-08-02T11:37:34.317821Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:37:34.318043Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [30.320 seconds] Console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65 with a serial console /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66 should be able to reconnect to console multiple times [It] /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:87 Expected error: <*errors.errorString | 0xc4208a0410>: { s: "Timeout trying to connect to the virtual machine instance", } Timeout trying to connect to the virtual machine instance not to have occurred /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:100 ------------------------------ STEP: Creating a new VirtualMachineInstance STEP: Checking that the console output equals to expected one Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=error timestamp=2018-08-02T11:38:14.084660Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:38:14.084786Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 2018/08/02 11:38:14 http: TLS handshake error from 10.244.1.1:52696: EOF level=error timestamp=2018-08-02T11:38:15.135850Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
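
Note on the virt-launcher startup lines above: each launcher reports "Watchdog file created at /var/run/kubevirt/watchdog-files/..." before marking itself ready. Such a file is normally used as a heartbeat: the launcher keeps refreshing its modification time and a watcher treats a stale file as a dead launcher. The sketch below shows that general pattern; the path and interval are illustrative, and this is not KubeVirt's actual watchdog implementation.

package main

import (
    "os"
    "time"
)

// touchWatchdog creates a watchdog file and keeps refreshing its mtime so a
// monitoring component can treat a stale file as a sign the launcher died.
func touchWatchdog(path string, interval time.Duration, stop <-chan struct{}) error {
    f, err := os.OpenFile(path, os.O_CREATE|os.O_WRONLY, 0644)
    if err != nil {
        return err
    }
    f.Close()
    ticker := time.NewTicker(interval)
    defer ticker.Stop()
    for {
        select {
        case <-stop:
            return nil
        case <-ticker.C:
            now := time.Now()
            if err := os.Chtimes(path, now, now); err != nil {
                return err
            }
        }
    }
}

func main() {
    // Demo only: heartbeat a temporary file for a few seconds, then stop.
    stop := make(chan struct{})
    go touchWatchdog("/tmp/kubevirt-demo-watchdog", time.Second, stop)
    time.Sleep(3 * time.Second)
    close(stop)
}
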
level=info timestamp=2018-08-02T11:38:15.135933Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:38:16.191362Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:38:16.191451Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:38:17.236347Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:38:17.236435Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:38:18.283170Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:38:18.283247Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:38:19.333972Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." level=info timestamp=2018-08-02T11:38:19.334057Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 level=error timestamp=2018-08-02T11:38:20.385010Z pos=subresource.go:54 component=virt-api reason="Unable to connect to VirtualMachineInstance because phase is Scheduled instead of Running" msg="Failed to gather remote exec info for subresource request." 
level=info timestamp=2018-08-02T11:38:20.385092Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2/namespaces/kubevirt-test-default/virtualmachineinstances/testvmizldhr/console proto=HTTP/1.1 statusCode=400 contentLength=89 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=error timestamp=2018-08-02T11:36:19.832239Z pos=preset.go:415 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="VirtualMachinePresets cannot be applied due to conflicts: presets 'test-memory-f67bm' and 'test-conflict-jk9lv' conflict: spec.resources.requests[memory]: {{128 6} {} 128M DecimalSI} != {{256 6} {} 256M DecimalSI}" level=warning timestamp=2018-08-02T11:36:19.832278Z pos=preset.go:157 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as failed" level=info timestamp=2018-08-02T11:36:19.832304Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisw2ft kind= uid=466e20e5-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.109468Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:20.109679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibmkfg kind= uid=4698dc66-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:36:20.182409Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisw2ft\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisw2ft, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 466e20e5-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisw2ft" level=info timestamp=2018-08-02T11:36:50.428433Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:36:50.429729Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmig2mkg kind= uid=58aaee00-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:20.749093Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:20.749227Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:20.817209Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been 
modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:20.849285Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:51.069067Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:51.070215Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:51.140025Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizldhr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." 
level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:38:07.441591Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:38:07.442373Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
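
Note on the status-update failure above: the StorageError precondition (expected UID vs empty UID in object meta) typically means the VMI was deleted while the update was in flight, so re-enqueuing and observing the deletion on the next sync, as the handler does here, is the sensible response. The related "the object has been modified; please apply your changes to the latest version and try again" conflicts seen in the controller log are the case the standard client-go retry.RetryOnConflict helper covers; a sketch with a hypothetical updateStatus follows.

package main

import (
    "fmt"

    "k8s.io/client-go/util/retry"
)

// updateStatus is a hypothetical stand-in for re-fetching the latest copy of
// the VMI and writing its status; a real controller would call the API server
// here with the refreshed resourceVersion.
func updateStatus() error {
    return nil
}

func main() {
    // RetryOnConflict re-runs the update with fresh data whenever the API
    // server answers with an optimistic-concurrency conflict, the same error
    // the controller log above keeps re-enqueuing on.
    err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
        return updateStatus()
    })
    if err != nil {
        fmt.Println("update failed:", err)
    }
}
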
level=info timestamp=2018-08-02T11:38:07.442478Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:10.002716Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizldhr, existing: true\n" level=info timestamp=2018-08-02T11:38:10.002807Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:38:10.002842Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:38:10.002948Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:38:10.003550Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:38:10.003716Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:15.123975Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizldhr, existing: true\n" level=info timestamp=2018-08-02T11:38:15.124060Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:38:15.124091Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:38:15.124191Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:38:15.124961Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:38:15.125067Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr"
Pod name: virt-launcher-testvmizldhr-rd82z
Pod phase: Running
level=info timestamp=2018-08-02T11:37:54.612106Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T11:37:54.613080Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:37:54.614356Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:38:04.622647Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:38:04.644872Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmizldhr"
level=info timestamp=2018-08-02T11:38:04.646408Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:38:04.646657Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [30.317 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should wait until the virtual machine is in running state and return a stream interface [It]
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:103
      Expected error:
          <*errors.errorString | 0xc4209481d0>: {
              s: "Timeout trying to connect to the virtual machine instance",
          }
          Timeout trying to connect to the virtual machine instance
      not to have occurred
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:109
------------------------------
STEP: Creating a new VirtualMachineInstance
• [SLOW TEST:30.220 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should fail waiting for the virtual machine instance to be running
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:111
------------------------------
• [SLOW TEST:30.219 seconds]
Console
/root/go/src/kubevirt.io/kubevirt/tests/console_test.go:37
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:65
    with a serial console
    /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:66
      should fail waiting for the expecter
      /root/go/src/kubevirt.io/kubevirt/tests/console_test.go:134
------------------------------
Pod name: disks-images-provider-4qn9k
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq
Pod phase: Running
level=info timestamp=2018-08-02T11:39:28.674987Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T11:39:32.204096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:39:34 http: TLS handshake error from 10.244.1.1:52744: EOF level=info timestamp=2018-08-02T11:39:40.970578Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:39:44 http: TLS handshake error from 10.244.1.1:52750: EOF 2018/08/02 11:39:54 http: TLS handshake error from 10.244.1.1:52756: EOF level=info timestamp=2018-08-02T11:39:56.065195Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:39:56.273765Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:39:58.676006Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:40:02.225713Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:03.742625Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:40:03.743898Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:40:04 http: TLS handshake error from 10.244.1.1:52762: EOF level=info timestamp=2018-08-02T11:40:11.024735Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:40:14 http: TLS handshake error from 10.244.1.1:52768: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:37:20.749093Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:20.749227Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:20.817209Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:20.849285Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info 
timestamp=2018-08-02T11:37:51.069067Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:51.070215Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:51.140025Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizldhr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:21.385637Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info 
timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:39:45.886852Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:39:45.887457Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:39:45.887532Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmic96fb" level=info timestamp=2018-08-02T11:39:56.127786Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmic96fb, existing: true\n" level=info timestamp=2018-08-02T11:39:56.127867Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:39:56.127897Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:39:56.128004Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:39:56.128905Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:39:56.129033Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmic96fb" level=info timestamp=2018-08-02T11:40:16.609233Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmic96fb, existing: true\n" level=info timestamp=2018-08-02T11:40:16.609321Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:40:16.609355Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:40:16.609476Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:40:16.610210Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:40:16.610309Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmic96fb"
Pod name: virt-launcher-testvmi9r5sn-9vtxr
Pod phase: Pending
Pod name: virt-launcher-testvmic96fb-psj84
Pod phase: Running
level=info timestamp=2018-08-02T11:39:25.118252Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T11:39:25.118536Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:39:25.120566Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:39:35.128066Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:39:35.164196Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmic96fb"
level=info timestamp=2018-08-02T11:39:35.165904Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:39:35.166094Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [60.250 seconds]
Slirp
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
  should be able to [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    VirtualMachineInstance with slirp interface
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
      Timed out after 30.004s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true
      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071
------------------------------
level=info timestamp=2018-08-02T11:39:22.104620Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmic96fb-psj84"
level=info timestamp=2018-08-02T11:39:36.210236Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmic96fb-psj84"
level=error timestamp=2018-08-02T11:39:36.235580Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.242329Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.249903Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.264022Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.305234Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.389247Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.547143Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
level=error timestamp=2018-08-02T11:39:36.869635Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
level=error timestamp=2018-08-02T11:39:37.512321Z pos=utils.go:254 component=tests
namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" level=error timestamp=2018-08-02T11:39:38.795382Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" level=error timestamp=2018-08-02T11:39:41.354650Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" level=error timestamp=2018-08-02T11:39:46.479245Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:40:26.118041Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:26.332181Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:28.686051Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:40:32.247094Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:40:34 http: TLS handshake error from 10.244.1.1:52780: EOF level=info timestamp=2018-08-02T11:40:41.078254Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:40:44 http: TLS handshake error from 10.244.1.1:52786: EOF 2018/08/02 11:40:54 http: TLS handshake error from 10.244.1.1:52792: EOF level=info timestamp=2018-08-02T11:40:56.164877Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:56.385193Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:58.670398Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 
contentLength=19 level=info timestamp=2018-08-02T11:41:02.268721Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:41:04 http: TLS handshake error from 10.244.1.1:52798: EOF level=info timestamp=2018-08-02T11:41:11.132019Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:41:14 http: TLS handshake error from 10.244.1.1:52804: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:37:20.817209Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:20.849285Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:51.069067Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:37:51.070215Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizldhr kind= uid=7cd0000c-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:37:51.140025Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizldhr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:21.385637Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 
msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." 
level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:40:56.495219Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:40:56.496085Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:40:56.496258Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiftfbq" level=info timestamp=2018-08-02T11:40:57.570540Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmic96fb, existing: true\n" level=info timestamp=2018-08-02T11:40:57.570723Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:40:57.570752Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:40:57.570856Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:40:57.575157Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:40:57.575374Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmic96fb" level=info timestamp=2018-08-02T11:41:16.976646Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiftfbq, existing: true\n" level=info timestamp=2018-08-02T11:41:16.976762Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:41:16.976793Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:41:16.976923Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:41:16.977565Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:41:16.977708Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiftfbq"
Pod name: virt-launcher-testvmi9r5sn-9vtxr
Pod phase: Pending
Pod name: virt-launcher-testvmic96fb-psj84
Pod phase: Running
level=info timestamp=2018-08-02T11:39:25.118252Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T11:39:25.118536Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:39:25.120566Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:39:35.128066Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:39:35.164196Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmic96fb"
level=info timestamp=2018-08-02T11:39:35.165904Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:39:35.166094Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiftfbq-gkzv5
Pod phase: Running
level=info timestamp=2018-08-02T11:40:24.748268Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T11:40:24.748591Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:40:24.750161Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:40:34.757255Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:40:34.808295Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiftfbq"
level=info timestamp=2018-08-02T11:40:34.810437Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:40:34.810702Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [60.446 seconds]
Slirp
/root/go/src/kubevirt.io/kubevirt/tests/vmi_slirp_interface_test.go:39
  should be able to [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    VirtualMachineInstance with slirp interface with custom MAC address
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
      Timed out after 30.004s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true
      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071
------------------------------
level=info timestamp=2018-08-02T11:40:22.321918Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiftfbq-gkzv5"
level=info timestamp=2018-08-02T11:40:36.568076Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiftfbq-gkzv5"
level=error timestamp=2018-08-02T11:40:36.586366Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.593340Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.599188Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.619296Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.660001Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.750001Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
level=error timestamp=2018-08-02T11:40:36.912307Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt"
level=error timestamp=2018-08-02T11:40:37.234205Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
level=error timestamp=2018-08-02T11:40:37.876685Z pos=utils.go:254 component=tests
namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" level=error timestamp=2018-08-02T11:40:39.169806Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" level=error timestamp=2018-08-02T11:40:41.720817Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" level=error timestamp=2018-08-02T11:40:46.846825Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmiftfbq kind=VirtualMachineInstance uid=d6b18587-9648-11e8-9cd7-525500d15501 reason="Warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:40:44 http: TLS handshake error from 10.244.1.1:52786: EOF 2018/08/02 11:40:54 http: TLS handshake error from 10.244.1.1:52792: EOF level=info timestamp=2018-08-02T11:40:56.164877Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:56.385193Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:40:58.670398Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:41:02.268721Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:41:04 http: TLS handshake error from 10.244.1.1:52798: EOF level=info timestamp=2018-08-02T11:41:11.132019Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:41:14 http: TLS handshake error from 10.244.1.1:52804: EOF 2018/08/02 11:41:24 http: TLS handshake error from 10.244.1.1:52810: EOF level=info timestamp=2018-08-02T11:41:26.218268Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:41:26.431804Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info 
timestamp=2018-08-02T11:41:28.663096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:41:32.290421Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:41:34 http: TLS handshake error from 10.244.1.1:52816: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:37:51.140025Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizldhr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:21.385637Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing 
VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:41:38.398276Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:41:38.398601Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:41:38.398693Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ks6d" level=info timestamp=2018-08-02T11:41:38.419462Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4ks6d, existing: true\n" level=info timestamp=2018-08-02T11:41:38.419531Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:41:38.419561Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:41:38.419653Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:41:38.420188Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:41:38.420275Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ks6d" level=info timestamp=2018-08-02T11:41:38.460482Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4ks6d, existing: true\n" level=info timestamp=2018-08-02T11:41:38.460549Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:41:38.460580Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:41:38.460665Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:41:38.461208Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:41:38.461289Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ks6d" Pod name: virt-launcher-testvmi4ks6d-dkcjg Pod phase: Running level=info timestamp=2018-08-02T11:41:27.221148Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:41:27.221517Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc420298280]]" level=info timestamp=2018-08-02T11:41:27.223446Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:41:37.229944Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:41:37.251450Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4ks6d" level=info timestamp=2018-08-02T11:41:37.252992Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:41:37.253176Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:43:28.688765Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:43:32.387536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:43:34 http: TLS handshake error from 10.244.1.1:52888: EOF level=info timestamp=2018-08-02T11:43:41.398722Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:43:44 http: TLS handshake error from 10.244.1.1:52894: EOF 2018/08/02 11:43:54 http: TLS handshake error from 10.244.1.1:52900: EOF level=info timestamp=2018-08-02T11:43:56.487259Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:43:56.693530Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:43:58.664062Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:44:02.418384Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:44:03.683956Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:44:03.685213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:44:04 http: TLS handshake error from 10.244.1.1:52906: EOF level=info timestamp=2018-08-02T11:44:11.451757Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:44:14 http: TLS handshake error from 10.244.1.1:52912: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:37:51.140025Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizldhr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizldhr" level=info timestamp=2018-08-02T11:38:21.385637Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info 
timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:42:19.495769Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:42:19.495835Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:42:19.495918Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmic96fb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:43:00.313281Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4ks6d, existing: true\n" level=info timestamp=2018-08-02T11:43:00.313363Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:43:00.313388Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:43:00.313463Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:43:00.314348Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:43:00.314649Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ks6d" level=info timestamp=2018-08-02T11:44:22.235066Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4ks6d, existing: true\n" level=info timestamp=2018-08-02T11:44:22.235152Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:44:22.235182Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:44:22.235273Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:44:22.235975Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:44:22.236092Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4ks6d" Pod name: virt-launcher-testvmi4ks6d-dkcjg Pod phase: Running level=info timestamp=2018-08-02T11:41:27.221148Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:41:27.221517Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc420298280]]" level=info timestamp=2018-08-02T11:41:27.223446Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:41:37.229944Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:41:37.251450Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4ks6d" level=info timestamp=2018-08-02T11:41:37.252992Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:41:37.253176Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.726 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should successfully start with hook sidecar annotation [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60 Unexpected Warning event received: testvmi4ks6d,fb15ed5b-9648-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ STEP: Starting a VMI level=info timestamp=2018-08-02T11:41:23.379826Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ks6d kind=VirtualMachineInstance uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4ks6d-dkcjg" level=info timestamp=2018-08-02T11:41:38.956318Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi4ks6d kind=VirtualMachineInstance uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi4ks6d-dkcjg" level=error timestamp=2018-08-02T11:41:38.975613Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi4ks6d kind=VirtualMachineInstance uid=fb15ed5b-9648-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:43:56.487259Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 
contentLength=136 level=info timestamp=2018-08-02T11:43:56.693530Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:43:58.664062Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:44:02.418384Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:44:03.683956Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:44:03.685213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:44:04 http: TLS handshake error from 10.244.1.1:52906: EOF level=info timestamp=2018-08-02T11:44:11.451757Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:44:14 http: TLS handshake error from 10.244.1.1:52912: EOF 2018/08/02 11:44:24 http: TLS handshake error from 10.244.1.1:52918: EOF level=info timestamp=2018-08-02T11:44:26.531101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:44:26.749059Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:44:28.664649Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:44:32.441458Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:44:34 http: TLS handshake error from 10.244.1.1:52924: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= 
uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info 
timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:44:38.627572Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:44:38.628132Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:44:38.628224Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqvstj" level=info timestamp=2018-08-02T11:44:38.648433Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqvstj, existing: true\n" level=info timestamp=2018-08-02T11:44:38.648516Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:44:38.648542Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:44:38.648641Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:44:38.649304Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:44:38.649398Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqvstj" level=info timestamp=2018-08-02T11:44:38.689695Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqvstj, existing: true\n" level=info timestamp=2018-08-02T11:44:38.689774Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:44:38.689819Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:44:38.689904Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:44:38.690426Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:44:38.690509Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqvstj" Pod name: virt-launcher-testvmiqvstj-xxr96 Pod phase: Running level=info timestamp=2018-08-02T11:44:28.357927Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:44:28.358457Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc420317f80]]" level=info timestamp=2018-08-02T11:44:28.361057Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:44:38.369158Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:44:38.426412Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqvstj" level=info timestamp=2018-08-02T11:44:38.430626Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:44:38.430840Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:46:28.677935Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:46:32.518008Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:46:34 http: TLS handshake error from 10.244.1.1:52996: EOF level=info timestamp=2018-08-02T11:46:41.718825Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:46:44 http: TLS handshake error from 10.244.1.1:53002: EOF 2018/08/02 11:46:54 http: TLS handshake error from 10.244.1.1:53008: EOF level=info timestamp=2018-08-02T11:46:56.798162Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:46:57.002210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:46:58.678588Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:47:02.537854Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:47:03.689021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:47:03.689996Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:47:04 http: TLS handshake error from 10.244.1.1:53014: EOF level=info timestamp=2018-08-02T11:47:11.772158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:47:14 http: TLS handshake error from 10.244.1.1:53020: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:38:21.385871Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4b7fd kind= uid=8ee1a489-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.450648Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4b7fd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4b7fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8ee1a489-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4b7fd" level=info timestamp=2018-08-02T11:38:51.605105Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:38:51.605206Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9r5sn kind= uid=a0e4f10f-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest 
version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to 
open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:46:00.542953Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:46:00.542978Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:46:00.543050Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:46:00.543695Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:46:00.543775Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqvstj" level=info timestamp=2018-08-02T11:47:06.076402Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4ks6d, existing: false\n" level=info timestamp=2018-08-02T11:47:06.076494Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:47:06.076582Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:47:06.076749Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4ks6d kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:47:22.464067Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqvstj, existing: true\n" level=info timestamp=2018-08-02T11:47:22.464155Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:47:22.464189Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:47:22.464293Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:47:22.465031Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:47:22.465137Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqvstj" Pod name: virt-launcher-testvmiqvstj-xxr96 Pod phase: Running level=info timestamp=2018-08-02T11:44:28.357927Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:44:28.358457Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc420317f80]]" level=info timestamp=2018-08-02T11:44:28.361057Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:44:38.369158Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:44:38.426412Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqvstj" level=info timestamp=2018-08-02T11:44:38.430626Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:44:38.430840Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.369 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should call Collect and OnDefineDomain on the hook sidecar [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67 Unexpected Warning event received: testvmiqvstj,66950389-9649-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ STEP: Getting hook-sidecar logs level=info timestamp=2018-08-02T11:44:23.719542Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiqvstj kind=VirtualMachineInstance uid=66950389-9649-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiqvstj-xxr96" level=info timestamp=2018-08-02T11:44:39.175909Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiqvstj kind=VirtualMachineInstance uid=66950389-9649-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiqvstj-xxr96" level=error timestamp=2018-08-02T11:44:39.194857Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiqvstj kind=VirtualMachineInstance uid=66950389-9649-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:46:56.798162Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 
statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:46:57.002210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:46:58.678588Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:47:02.537854Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:47:03.689021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:47:03.689996Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:47:04 http: TLS handshake error from 10.244.1.1:53014: EOF level=info timestamp=2018-08-02T11:47:11.772158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:47:14 http: TLS handshake error from 10.244.1.1:53020: EOF 2018/08/02 11:47:24 http: TLS handshake error from 10.244.1.1:53026: EOF level=info timestamp=2018-08-02T11:47:26.851998Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:47:27.059667Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:47:28.690336Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:47:32.555126Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:47:34 http: TLS handshake error from 10.244.1.1:53032: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.627654Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:47:23.628499Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.664086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:23.702594Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object 
has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:47:40.656797Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:47:40.657312Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:47:40.657390Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:40.677826Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmirvdbf, existing: true\n" level=info timestamp=2018-08-02T11:47:40.677855Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:47:40.677876Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:47:40.677926Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:47:40.678234Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:47:40.678302Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:40.718515Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmirvdbf, existing: true\n" level=info timestamp=2018-08-02T11:47:40.718579Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:47:40.718606Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:47:40.718712Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:47:40.719162Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:47:40.719236Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" Pod name: virt-launcher-testvmirvdbf-p4tvz Pod phase: Running level=info timestamp=2018-08-02T11:47:29.147798Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:47:29.148240Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc4202aa580]]" level=info timestamp=2018-08-02T11:47:29.151291Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:47:39.157224Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:47:39.191067Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmirvdbf" level=info timestamp=2018-08-02T11:47:39.192582Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:47:39.192750Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:49:28.720215Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:49:32.641068Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:49:34 http: TLS handshake error from 10.244.1.1:53104: EOF level=info timestamp=2018-08-02T11:49:42.038177Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:49:44 http: TLS handshake error from 10.244.1.1:53110: EOF 2018/08/02 11:49:54 http: TLS handshake error from 10.244.1.1:53116: EOF level=info timestamp=2018-08-02T11:49:57.110482Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:49:57.323073Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:49:58.676608Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:50:02.662581Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:03.645098Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:50:03.646108Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 11:50:04 http: TLS handshake error from 10.244.1.1:53122: EOF level=info timestamp=2018-08-02T11:50:12.091213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:50:14 http: TLS handshake error from 10.244.1.1:53128: EOF Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:38:51.666655Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:39:21.622840Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:39:21.623180Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmic96fb kind= uid=b2c81115-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": 
StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.627654Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:47:23.628499Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.664086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:23.702594Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from 
path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
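Editor's note: the virt-controller entries above log "the object has been modified; please apply your changes to the latest version and try again" and then simply re-enqueue the VMI key. That wording is the apiserver's optimistic-concurrency conflict, not a real failure. Below is a minimal, self-contained sketch of the standard client-go way to handle such a conflict (retry against the latest version); it is illustrative only and not virt-controller's actual code, which re-enqueues through its work queue instead.

// Sketch: retrying an update that hits an optimistic-concurrency conflict.
package main

import (
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/util/retry"
)

func main() {
	gr := schema.GroupResource{Group: "kubevirt.io", Resource: "virtualmachineinstances"}
	attempts := 0

	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
		attempts++
		if attempts < 3 {
			// Simulate the apiserver rejecting a stale update; this is what
			// produces the "object has been modified" wording in the log above.
			return apierrors.NewConflict(gr, "testvmi9r5sn", fmt.Errorf("stale resourceVersion"))
		}
		return nil // the retried update, based on the latest version, succeeds
	})
	fmt.Printf("update finished after %d attempts, err=%v\n", attempts, err)
}

The StorageError / "Precondition failed: UID in precondition" variant seen in the virt-handler-snkp5 log is a different case, consistent with a status update racing with the VMI's deletion; the handler likewise re-enqueues and then observes the VMI as gone.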
Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:48:21.608968Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:48:21.608991Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:48:21.609068Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:48:21.609624Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:48:21.609749Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:49:02.570072Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmirvdbf, existing: true\n" level=info timestamp=2018-08-02T11:49:02.570160Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:49:02.570200Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:49:02.570297Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:49:02.571590Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:49:02.572251Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:50:06.305412Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqvstj, existing: false\n" level=info timestamp=2018-08-02T11:50:06.305492Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:50:06.305559Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:50:06.305636Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiqvstj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
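Editor's note: every error in the virt-handler-znjs4 log above is the same one, "Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/<tid>/ns/mnt". Those paths are kernel namespace symlinks whose link target encodes the namespace type. The sketch below shows that kind of detection, assuming it is done by reading the symlink; this is hypothetical code for illustration, not KubeVirt's actual isolation detector. A task directory that disappears between listing and readlink would fail in exactly this way.

// Sketch: detect the namespace type behind /proc/<pid>/task/<tid>/ns/<type>
// by reading the symlink target, which the kernel formats as "<type>:[<inode>]".
package main

import (
	"fmt"
	"os"
	"strings"
)

func detectNamespaceType(path string) (string, error) {
	target, err := os.Readlink(path)
	if err != nil {
		// A vanished task directory (e.g. the thread already exited) surfaces
		// here, matching the repeated "Error detecting namespace type from
		// path" messages in the virt-handler log above.
		return "", fmt.Errorf("error detecting namespace type from path %s: %v", path, err)
	}
	idx := strings.Index(target, ":[")
	if idx < 0 {
		return "", fmt.Errorf("unexpected namespace link format: %q", target)
	}
	return target[:idx], nil
}

func main() {
	nsType, err := detectNamespaceType("/proc/self/ns/mnt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("namespace type:", nsType)
}

On a Linux host this prints "mnt" for the calling process.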
Pod name: virt-launcher-testvmirvdbf-p4tvz Pod phase: Running level=info timestamp=2018-08-02T11:47:29.147798Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:47:29.148240Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc4202aa580]]" level=info timestamp=2018-08-02T11:47:29.151291Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:47:39.157224Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:47:39.191067Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmirvdbf" level=info timestamp=2018-08-02T11:47:39.192582Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:47:39.192750Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.342 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should update domain XML with SM BIOS properties [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:83 Unexpected Warning event received: testvmirvdbf,d214ec13-9649-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ STEP: Reading domain XML using virsh level=info timestamp=2018-08-02T11:47:24.092113Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmirvdbf kind=VirtualMachineInstance uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmirvdbf-p4tvz" level=info timestamp=2018-08-02T11:47:41.210225Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmirvdbf kind=VirtualMachineInstance uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmirvdbf-p4tvz" level=error timestamp=2018-08-02T11:47:41.230428Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmirvdbf kind=VirtualMachineInstance uid=d214ec13-9649-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3392/ns/mnt" S [SKIPPING] in Spec Setup (BeforeEach) [0.004 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 should succeed to start a vmi [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:133 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.003 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 should succeed to stop a running vmi [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:139 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 
------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.003 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 with winrm connection [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150 should have correct UUID /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:192 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.003 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 with winrm connection [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150 should have pod IP /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:208 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.003 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 with kubectl command [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226 should succeed to start a vmi /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:242 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 ------------------------------ S [SKIPPING] in Spec Setup (BeforeEach) [0.004 seconds] Windows VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57 with kubectl command [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226 should succeed to stop a vmi /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:250 Skip Windows tests that requires PVC disk-windows /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1384 ------------------------------ • [SLOW TEST:5.419 seconds] User Access /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:33 With default kubevirt service accounts /root/go/src/kubevirt.io/kubevirt/tests/access_test.go:41 should verify permissions are correct for view, edit, and admin /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 given a vmi /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 ------------------------------ ••• Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running level=info timestamp=2018-08-02T11:50:42.960808Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.093048Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.101039Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.238121Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 
statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.246770Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.381642Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.390422Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.527074Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.536468Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.686904Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:43.696284Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:50:44 http: TLS handshake error from 10.244.1.1:53146: EOF 2018/08/02 11:50:54 http: TLS handshake error from 10.244.1.1:53152: EOF level=info timestamp=2018-08-02T11:50:57.214057Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:50:57.426038Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot 
be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.627654Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:47:23.628499Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.664086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:23.702594Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:50:43.980313Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:50:43.980455Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:50:44.035914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwf5zq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" Pod name: 
virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:50:58.317818Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:50:58.318231Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:50:58.318299Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:50:58.338489Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwf5zq, existing: true\n" level=info timestamp=2018-08-02T11:50:58.338579Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:50:58.338620Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:50:58.338735Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:50:58.339306Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:50:58.339394Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:50:58.379759Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwf5zq, existing: true\n" level=info timestamp=2018-08-02T11:50:58.379843Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:50:58.379874Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:50:58.379973Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:50:58.380441Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:50:58.380516Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" Pod name: virt-launcher-testvmiwf5zq-s2tdn Pod phase: Running level=info timestamp=2018-08-02T11:50:47.137586Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:50:47.137878Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:50:47.139284Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:50:57.161398Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:50:57.185302Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwf5zq" level=info timestamp=2018-08-02T11:50:57.187006Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:50:57.187154Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:52:54 http: TLS handshake error from 10.244.1.1:53224: EOF level=info timestamp=2018-08-02T11:52:57.439054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:52:57.631251Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:52:58.680738Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:53:02.794298Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:53:04 http: TLS handshake error from 10.244.1.1:53230: EOF level=info timestamp=2018-08-02T11:53:12.419096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:53:14 http: TLS handshake error from 10.244.1.1:53236: EOF 2018/08/02 11:53:24 http: TLS handshake error from 10.244.1.1:53242: EOF level=info timestamp=2018-08-02T11:53:27.492394Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:53:27.684207Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 
level=info timestamp=2018-08-02T11:53:28.682006Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:53:32.814980Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:53:34 http: TLS handshake error from 10.244.1.1:53248: EOF level=info timestamp=2018-08-02T11:53:42.472138Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:40:21.866766Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:40:21.867432Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiftfbq kind= uid=d6b18587-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:41:22.388395Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.398155Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9r5sn\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9r5sn, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: a0e4f10f-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9r5sn" level=info timestamp=2018-08-02T11:41:22.922892Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:41:22.923313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4ks6d kind= uid=fb15ed5b-9648-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:44:23.284296Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:44:23.288119Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqvstj kind= uid=66950389-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.627654Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= 
uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:47:23.628499Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirvdbf kind= uid=d214ec13-9649-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:47:23.664086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:47:23.702594Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmirvdbf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmirvdbf" level=info timestamp=2018-08-02T11:50:43.980313Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:50:43.980455Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:50:44.035914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwf5zq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:37:45.315712Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.872738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: true\n" level=info timestamp=2018-08-02T11:37:50.872837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:37:50.872868Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.884214Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 msg="Processing local ephemeral data cleanup for shutdown domain." 
level=error timestamp=2018-08-02T11:37:50.897252Z pos=vm.go:433 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind= uid=6abd43e6-9648-11e8-9cd7-525500d15501 reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="Updating the VirtualMachineInstance status failed." level=info timestamp=2018-08-02T11:37:50.897321Z pos=vm.go:253 component=virt-handler reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4l72t\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi4l72t, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6abd43e6-9648-11e8-9cd7-525500d15501, UID in object meta: " msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4l72t" level=info timestamp=2018-08-02T11:37:50.897924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:50.898112Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:50.898178Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:50.898247Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:37:55.556003Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4l72t, existing: false\n" level=info timestamp=2018-08-02T11:37:55.556089Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:37:55.556175Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:37:55.556345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi4l72t kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:51:39.271102Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:51:39.271808Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:51:39.271916Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3396/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:52:20.234414Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwf5zq, existing: true\n" level=info timestamp=2018-08-02T11:52:20.234503Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:52:20.234539Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:52:20.234628Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:52:20.235399Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:52:20.235509Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:53:42.155807Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwf5zq, existing: true\n" level=info timestamp=2018-08-02T11:53:42.155896Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:53:42.155926Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:53:42.156010Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:53:42.157079Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:53:42.157420Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3391/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq"
Pod name: virt-launcher-testvmiwf5zq-s2tdn
Pod phase: Running
level=info timestamp=2018-08-02T11:50:47.137586Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T11:50:47.137878Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:50:47.139284Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:50:57.161398Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:50:57.185302Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwf5zq"
level=info timestamp=2018-08-02T11:50:57.187006Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:50:57.187154Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.173 seconds]
Health Monitoring
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37
  A VirtualMachineInstance with a watchdog device
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56
    should be shut down when the watchdog expires [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57

    Unexpected Warning event received: testvmiwf5zq,4980a1ad-964a-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt
    Expected
      : Warning
    not to equal
      : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-08-02T11:50:44.489487Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiwf5zq kind=VirtualMachineInstance uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiwf5zq-s2tdn"
level=info timestamp=2018-08-02T11:50:58.875228Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiwf5zq kind=VirtualMachineInstance uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiwf5zq-s2tdn"
level=error timestamp=2018-08-02T11:50:58.890796Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiwf5zq kind=VirtualMachineInstance uid=4980a1ad-964a-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt"
Pod name: disks-images-provider-4qn9k
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq
Pod phase: Running
level=info timestamp=2018-08-02T11:53:12.419096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 11:53:14 http: TLS handshake
error from 10.244.1.1:53236: EOF 2018/08/02 11:53:24 http: TLS handshake error from 10.244.1.1:53242: EOF level=info timestamp=2018-08-02T11:53:27.492394Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:53:27.684207Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:53:28.682006Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:53:32.814980Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:53:34 http: TLS handshake error from 10.244.1.1:53248: EOF level=info timestamp=2018-08-02T11:53:42.472138Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:53:44 http: TLS handshake error from 10.244.1.1:53254: EOF 2018/08/02 11:53:54 http: TLS handshake error from 10.244.1.1:53260: EOF level=info timestamp=2018-08-02T11:53:57.544063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:53:57.731828Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:53:58.691962Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:54:02.861097Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:50:43.980313Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:50:43.980455Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:50:44.035914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwf5zq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:53:44.321986Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info 
timestamp=2018-08-02T11:53:44.322181Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.328380Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9sp2m kind= uid=b4ffbc6c-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.328506Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9sp2m kind= uid=b4ffbc6c-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.340901Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.341089Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.373545Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.373666Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.425671Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9sp2m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9sp2m" level=info timestamp=2018-08-02T11:53:44.446069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiksghm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:53:45.129599Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" level=info timestamp=2018-08-02T11:53:45.526553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:54:00.606486Z pos=vm.go:418 
component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:00.608318Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:54:00.611526Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" level=info timestamp=2018-08-02T11:54:01.256055Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksbnn, existing: true\n" level=info timestamp=2018-08-02T11:54:01.256141Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:54:01.256171Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:54:01.256280Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:01.383541Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:54:01.383756Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5367/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" level=info timestamp=2018-08-02T11:54:02.664022Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksbnn, existing: true\n" level=info timestamp=2018-08-02T11:54:02.664097Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:54:02.664119Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:54:02.664249Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:02.666427Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:54:02.666787Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:54:03.123922Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:03.124380Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:54:03.124471Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:54:03.144648Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksghm, existing: true\n" level=info timestamp=2018-08-02T11:54:03.144736Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:54:03.144766Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:54:03.144851Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:03.145293Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:54:03.145384Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:54:03.185630Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksghm, existing: true\n" level=info timestamp=2018-08-02T11:54:03.185745Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:54:03.185777Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:54:03.185887Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:03.186483Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:54:03.186594Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3388/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" Pod name: virt-launcher-testvmi9sp2m-g549r Pod phase: Running level=info timestamp=2018-08-02T11:53:49.557506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:49.559474Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:49.560941Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:54:01.289347Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:54:01.322278Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9sp2m" level=info timestamp=2018-08-02T11:54:01.324703Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:54:01.324870Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiksbnn-hjzx5 Pod phase: Running level=info timestamp=2018-08-02T11:53:47.778163Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:47.778469Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:47.779861Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:53:57.788035Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:53:57.829371Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiksbnn" level=info timestamp=2018-08-02T11:53:57.832116Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:53:57.832303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiksghm-swtvv Pod phase: Running level=info timestamp=2018-08-02T11:53:49.081851Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:49.082096Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:49.083355Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:54:01.699934Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:54:01.758704Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiksghm" level=info timestamp=2018-08-02T11:54:01.761027Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:54:01.765910Z 
pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmipsmdt-vbsn2 Pod phase: Running level=info timestamp=2018-08-02T11:53:49.850253Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:49.850560Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:49.853351Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:54:02.165455Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:54:02.187188Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipsmdt" level=info timestamp=2018-08-02T11:54:02.188933Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:54:02.189108Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4qn9k Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-t2mkf Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-crmbq Pod phase: Running 2018/08/02 11:55:54 http: TLS handshake error from 10.244.1.1:53332: EOF level=info timestamp=2018-08-02T11:55:57.752192Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:55:57.938126Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:55:58.682607Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:56:02.978314Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:56:04 http: TLS handshake error from 10.244.1.1:53338: EOF level=info timestamp=2018-08-02T11:56:12.775283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:56:14 http: TLS handshake error from 10.244.1.1:53344: EOF 2018/08/02 11:56:24 http: TLS handshake error from 10.244.1.1:53350: EOF level=info timestamp=2018-08-02T11:56:27.806386Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:56:27.992094Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:56:28.696036Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info 
timestamp=2018-08-02T11:56:32.998098Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:56:34 http: TLS handshake error from 10.244.1.1:53356: EOF level=info timestamp=2018-08-02T11:56:42.830018Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:50:43.980313Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:50:43.980455Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwf5zq kind= uid=4980a1ad-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:50:44.035914Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiwf5zq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiwf5zq" level=info timestamp=2018-08-02T11:53:44.321986Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.322181Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.328380Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9sp2m kind= uid=b4ffbc6c-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.328506Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9sp2m kind= uid=b4ffbc6c-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.340901Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.341089Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.373545Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:53:44.373666Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.425671Z pos=vmi.go:157 
component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9sp2m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9sp2m" level=info timestamp=2018-08-02T11:53:44.446069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiksghm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:53:45.129599Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" level=info timestamp=2018-08-02T11:53:45.526553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:54:41.076416Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:54:41.078061Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:54:41.078660Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" level=info timestamp=2018-08-02T11:55:22.039022Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksbnn, existing: true\n" level=info timestamp=2018-08-02T11:55:22.039108Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:55:22.039139Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:55:22.039247Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:55:22.039895Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:55:22.040394Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" level=info timestamp=2018-08-02T11:56:43.960703Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksbnn, existing: true\n" level=info timestamp=2018-08-02T11:56:43.960786Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:56:43.960816Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:43.960913Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:56:43.961623Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksbnn kind= uid=b5011bf4-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:56:43.961730Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5366/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksbnn" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:55:25.043144Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:55:25.043177Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:55:25.043319Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:55:25.044406Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind= uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:55:25.044499Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3757/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:55:25.797601Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipsmdt, existing: true\n" level=info timestamp=2018-08-02T11:55:25.797715Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:55:25.797753Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:55:25.797863Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:55:25.798576Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:55:25.798665Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" level=info timestamp=2018-08-02T11:56:25.997831Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwf5zq, existing: false\n" level=info timestamp=2018-08-02T11:56:25.997906Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:25.997963Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:56:25.998033Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiwf5zq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmi9sp2m-g549r Pod phase: Running level=info timestamp=2018-08-02T11:53:49.557506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:49.559474Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:49.560941Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:54:01.289347Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:54:01.322278Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9sp2m" level=info timestamp=2018-08-02T11:54:01.324703Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:54:01.324870Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiksbnn-hjzx5 Pod phase: Running level=info timestamp=2018-08-02T11:53:47.778163Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:47.778469Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:47.779861Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:53:57.788035Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:53:57.829371Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiksbnn" level=info timestamp=2018-08-02T11:53:57.832116Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:53:57.832303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiksghm-swtvv Pod phase: Running level=info timestamp=2018-08-02T11:53:49.081851Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:53:49.082096Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:53:49.083355Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:54:01.699934Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:54:01.758704Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiksghm" level=info timestamp=2018-08-02T11:54:01.761027Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:54:01.765910Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmipsmdt-vbsn2 Pod phase: Running level=info timestamp=2018-08-02T11:53:49.850253Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" 
level=info timestamp=2018-08-02T11:53:49.850560Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T11:53:49.853351Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T11:54:02.165455Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T11:54:02.187188Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipsmdt"
level=info timestamp=2018-08-02T11:54:02.188933Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T11:54:02.189108Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [180.887 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmiksghm,b4fe2a5d-964a-11e8-9cd7-525500d15501: Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt
    Expected
      : Warning
    not to equal
      : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T11:53:44.901047Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiksghm kind=VirtualMachineInstance uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiksghm-swtvv"
level=info timestamp=2018-08-02T11:54:03.682689Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiksghm kind=VirtualMachineInstance uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiksghm-swtvv"
level=error timestamp=2018-08-02T11:54:03.695626Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiksghm kind=VirtualMachineInstance uid=b4fe2a5d-964a-11e8-9cd7-525500d15501 reason="unexpected warning event received" msg="Failed to open current namespace: Error detecting namespace type from path: /proc/3361/task/3361/ns/mnt"
Pod name: disks-images-provider-4qn9k
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-t2mkf
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-crmbq
Pod phase: Running
2018/08/02 11:56:04 http: TLS handshake error from 10.244.1.1:53338: EOF
level=info timestamp=2018-08-02T11:56:12.775283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 11:56:14 http: TLS handshake error from 10.244.1.1:53344: EOF
2018/08/02 11:56:24 http: TLS handshake error from 10.244.1.1:53350: EOF
level=info timestamp=2018-08-02T11:56:27.806386Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T11:56:27.992094Z pos=filter.go:46 component=virt-api
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:56:28.696036Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T11:56:32.998098Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:56:34 http: TLS handshake error from 10.244.1.1:53356: EOF level=info timestamp=2018-08-02T11:56:42.830018Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 11:56:44 http: TLS handshake error from 10.244.1.1:53362: EOF 2018/08/02 11:56:54 http: TLS handshake error from 10.244.1.1:53368: EOF level=info timestamp=2018-08-02T11:56:57.857452Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:56:58.045464Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T11:56:58.693192Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-67dcdd8464-v5j45 Pod phase: Running level=info timestamp=2018-08-02T11:53:44.373666Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipsmdt kind= uid=b502d2bd-964a-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:53:44.425671Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9sp2m\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9sp2m" level=info timestamp=2018-08-02T11:53:44.446069Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiksghm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiksghm" level=info timestamp=2018-08-02T11:53:45.129599Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" level=info timestamp=2018-08-02T11:53:45.526553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipsmdt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipsmdt" level=info timestamp=2018-08-02T11:56:45.808681Z pos=preset.go:142 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:56:45.812818Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:56:45.817408Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7p5s6 kind= uid=212d0830-964b-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:56:45.817538Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7p5s6 kind= uid=212d0830-964b-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:56:45.833928Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikmvcn kind= uid=212e263c-964b-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:56:45.835664Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikmvcn kind= uid=212e263c-964b-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:56:45.840665Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil4wpx kind= uid=2130336d-964b-11e8-9cd7-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T11:56:45.840769Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmil4wpx kind= uid=2130336d-964b-11e8-9cd7-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T11:56:45.969703Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7p5s6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7p5s6" level=info timestamp=2018-08-02T11:56:45.976646Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipkjwz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipkjwz" Pod name: virt-controller-67dcdd8464-wxgxp Pod phase: Running level=info timestamp=2018-08-02T10:45:11.765482Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-snkp5 Pod phase: Running level=info timestamp=2018-08-02T11:57:00.739339Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:57:00.739664Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T11:57:00.739734Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5369/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipkjwz" level=info timestamp=2018-08-02T11:57:00.759926Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipkjwz, existing: true\n" level=info timestamp=2018-08-02T11:57:00.760029Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:57:00.760059Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:57:00.760155Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:57:00.760723Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:57:00.760811Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5370/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipkjwz" level=info timestamp=2018-08-02T11:57:00.801081Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipkjwz, existing: true\n" level=info timestamp=2018-08-02T11:57:00.801163Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T11:57:00.801188Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:57:00.801273Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T11:57:00.802057Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipkjwz kind= uid=212b4af4-964b-11e8-9cd7-525500d15501 reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T11:57:00.802843Z pos=vm.go:253 component=virt-handler reason="Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/5360/ns/mnt" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipkjwz" Pod name: virt-handler-znjs4 Pod phase: Running level=info timestamp=2018-08-02T11:56:44.953204Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:44.953254Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:56:44.953314Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T11:56:45.899638Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi9sp2m, existing: false\n" level=info timestamp=2018-08-02T11:56:45.899753Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:45.899812Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi9sp2m kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:56:45.899928Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi9sp2m kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:56:46.965477Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiksghm, existing: false\n" level=info timestamp=2018-08-02T11:56:46.965562Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:46.965824Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:56:46.965899Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiksghm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T11:56:47.718952Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipsmdt, existing: false\n" level=info timestamp=2018-08-02T11:56:47.720117Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T11:56:47.720335Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T11:56:47.720425Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipsmdt kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
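Note on the virt-handler entries above: every sync attempt for testvmipkjwz fails with "Failed to open current namespace: Error detecting namespace type from path: /proc/5307/task/<tid>/ns/mnt" and is re-enqueued, so the VMI never leaves the Scheduled phase. As background only: on Linux, a /proc/<pid>/task/<tid>/ns/<ns> entry is a symlink whose target encodes the namespace type, e.g. "mnt:[4026531840]", so detecting the type amounts to resolving that link and parsing the prefix. The sketch below illustrates that lookup in general terms; it is not KubeVirt's implementation, and nsTypeFromPath is a hypothetical helper.

// namespace_sketch.go — illustrative only, not KubeVirt code.
package main

import (
	"fmt"
	"os"
	"strings"
)

// nsTypeFromPath returns the namespace type ("mnt", "net", "pid", ...) encoded
// in the symlink target of a /proc/<pid>/task/<tid>/ns/<ns> path,
// e.g. "mnt:[4026531840]" -> "mnt".
func nsTypeFromPath(path string) (string, error) {
	target, err := os.Readlink(path)
	if err != nil {
		// A readlink failure here (for example because the task entry under
		// /proc is gone) would surface as the kind of "Error detecting
		// namespace type from path" message seen in the handler log above.
		return "", fmt.Errorf("error detecting namespace type from path %s: %v", path, err)
	}
	i := strings.Index(target, ":[")
	if i < 0 {
		return "", fmt.Errorf("unexpected namespace link target %q", target)
	}
	return target[:i], nil
}

func main() {
	// /proc/self/ns/mnt always exists for the calling process on Linux.
	nsType, err := nsTypeFromPath("/proc/self/ns/mnt")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("namespace type:", nsType) // typically prints "mnt"
}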
Pod name: virt-launcher-testvmi7p5s6-8l47c Pod phase: Running level=info timestamp=2018-08-02T11:56:51.637399Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:56:51.637696Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:56:51.639744Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" Pod name: virt-launcher-testvmikmvcn-cd2ts Pod phase: Running level=info timestamp=2018-08-02T11:56:51.879021Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:56:51.879363Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:56:51.881366Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" Pod name: virt-launcher-testvmil4wpx-h4wr2 Pod phase: Running level=info timestamp=2018-08-02T11:56:51.994134Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:56:51.994520Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:56:51.997021Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" Pod name: virt-launcher-testvmipkjwz-fhjjd Pod phase: Running level=info timestamp=2018-08-02T11:56:49.682903Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T11:56:49.683235Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T11:56:49.685489Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T11:56:59.692557Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T11:56:59.735453Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipkjwz" level=info timestamp=2018-08-02T11:56:59.737156Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T11:56:59.737318Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmipsmdt-vbsn2 Pod phase: Failed panic: test timed out after 1h30m0s goroutine 11343 [running]: testing.(*M).startAlarm.func1() /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc created by time.goFunc /gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44 goroutine 1 [chan receive, 90 minutes]: testing.(*T).Run(0xc4208bca50, 0x139e775, 0x9, 0x1430cc8, 0x4801e6) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301 testing.runTests.func1(0xc4208bc960) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64 testing.tRunner(0xc4208bc960, 0xc420755df8) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0 testing.runTests(0xc4203c71e0, 0x1d32a50, 0x1, 0x1, 0x412009) 
/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4 testing.(*M).Run(0xc420860200, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171 main.main() _testmain.go:44 +0x151 goroutine 20 [chan receive]: kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1d5e280) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0 /root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203 goroutine 21 [syscall, 90 minutes]: os/signal.signal_recv(0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6 os/signal.loop() /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22 created by os/signal.init.0 /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41 goroutine 24 [select]: kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).Watch(0xc420d0f030, 0xc42040a330) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:285 +0x579 kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).WaitFor(0xc420d0f030, 0x139a511, 0x6, 0x11e3660, 0x14b11f0, 0x0) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:295 +0xba kubevirt.io/kubevirt/tests.waitForVMIStart(0x14ba840, 0xc420a20f00, 0x5a, 0x0, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1051 +0x50e kubevirt.io/kubevirt/tests.WaitForSuccessfulVMIStart(0x14ba840, 0xc420a20f00, 0xc420d0f120, 0xa5a8de) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 +0x43 kubevirt.io/kubevirt/tests.WaitUntilVMIReady(0xc420a20f00, 0x1430ca0, 0x1d7c938) /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1093 +0x3b kubevirt.io/kubevirt/tests_test.glob..func18.4() /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:135 +0x72b kubevirt.io/kubevirt/tests.BeforeAll.func1() /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1375 +0x3f kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc4207f9c20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc4207f9c20, 0xc4203a6b80, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*SetupNode).Run(0xc4204c4658, 0x14b6ce0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) 
/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/setup_nodes.go:15 +0x7f kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc4206362d0, 0x0, 0x14b6ce0, 0xc4200dd4c0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:181 +0x1f1 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc4206362d0, 0x14b6ce0, 0xc4200dd4c0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc42061f900, 0xc4206362d0, 0x0) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc42061f900, 0x1) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc42061f900, 0xb) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200daaf0, 0x7fc7a57d0870, 0xc4208bca50, 0x13a0d58, 0xb, 0xc4203c7240, 0x2, 0x2, 0x14d35e0, 0xc4200dd4c0, ...) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x14b7d40, 0xc4208bca50, 0x13a0d58, 0xb, 0xc4203c7220, 0x2, 0x2, 0x2) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x14b7d40, 0xc4208bca50, 0x13a0d58, 0xb, 0xc4203a73e0, 0x1, 0x1, 0x1) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab kubevirt.io/kubevirt/tests_test.TestTests(0xc4208bca50) /root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa testing.tRunner(0xc4208bca50, 0x1430cc8) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0 created by testing.(*T).Run /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0 goroutine 25 [chan receive, 90 minutes]: kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc42061f900, 0xc4200c1c80) /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1 created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88 goroutine 26 [select, 90 minutes, locked to thread]: runtime.gopark(0x1432ea0, 0x0, 0x139b297, 0x6, 0x18, 0x1) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a runtime.selectgo(0xc420078f50, 0xc4200c1d40) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50 runtime.ensureSigM.func1() /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4 runtime.goexit() /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1 goroutine 51 [IO wait]: internal/poll.runtime_pollWait(0x7fc7a57bbff8, 0x72, 0xc42029d850) /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57 internal/poll.(*pollDesc).wait(0xc420448018, 0x72, 
0xffffffffffffff00, 0x14b8f00, 0x1c497d0) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b internal/poll.(*pollDesc).waitRead(0xc420448018, 0xc420a0c000, 0x8000, 0x8000) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d internal/poll.(*FD).Read(0xc420448000, 0xc420a0c000, 0x8000, 0x8000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d net.(*netFD).Read(0xc420448000, 0xc420a0c000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb) /gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f net.(*conn).Read(0xc4200d8018, 0xc420a0c000, 0x8000, 0x8000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a crypto/tls.(*block).readFromUntil(0xc420650090, 0x7fc7a57138e0, 0xc4200d8018, 0x5, 0xc4200d8018, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96 crypto/tls.(*Conn).readRecord(0xc420458000, 0x1432f17, 0xc420458120, 0x20) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0 crypto/tls.(*Conn).Read(0xc420458000, 0xc42042d000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100 bufio.(*Reader).Read(0xc42056c2a0, 0xc420410118, 0x9, 0x9, 0xc420456ca8, 0xc42029dc60, 0x7ccd2a) /gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238 io.ReadAtLeast(0x14b5ae0, 0xc42056c2a0, 0xc420410118, 0x9, 0x9, 0x9, 0xc42064d6a0, 0xc42064d6a0, 0xc4209044b0) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86 io.ReadFull(0x14b5ae0, 0xc42056c2a0, 0xc420410118, 0x9, 0x9, 0x1432fe8, 0xc42029dd10, 0x462d33) /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58 kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc420410118, 0x9, 0x9, 0x14b5ae0, 0xc42056c2a0, 0x0, 0xc400000000, 0x7efb60, 0xc4206baf28) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc4204100e0, 0xc4209044b0, 0x0, 0x0, 0x0) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4 kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc42029dfb0, 0x1431c20, 0xc4205377b0) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc420640000) /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76 created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651 goroutine 10804 [chan send, 3 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420758150) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 110 [chan send, 89 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42066d290) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher 
/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 2776 [chan send, 74 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42057d9b0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 8425 [chan send, 15 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420758d50) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 643 [chan send, 86 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4204ec7e0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 1740 [chan send, 80 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4209057a0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 1176 [chan send, 83 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4206629f0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 2236 [chan send, 78 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420662fc0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 3216 [chan send, 71 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420663290) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 11336 [chan send]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42053d7d0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 10293 [chan send, 6 minutes]: 
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc4205293b0) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 8948 [chan send, 12 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42089d800) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8 goroutine 9393 [chan send, 9 minutes]: kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc420651e00) /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:114 +0x114 created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8
make: *** [functest] Error 2
+ make cluster-down
./cluster/down.sh
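Note on the failure above: the "panic: test timed out after 1h30m0s" and the goroutine dump that follows come from the functional-test binary itself, not from a launcher pod. Goroutine 24 shows the networking spec blocked in tests.WaitForSuccessfulVMIStart (vmi_networking_test.go:135) for the whole run, consistent with virt-handler never managing to sync the VMIs, until the go test runner's -timeout (1h30m here, presumably set by the functest target) fired, dumped all goroutines, and exited non-zero, which make then reported as Error 2 before cluster-down ran. The hypothetical test file below is not part of the KubeVirt suite; it only reproduces the timeout mechanics: any test that never returns makes the runner panic with "test timed out after <d>" and print this kind of goroutine dump.

// timeout_example_test.go — hypothetical, for illustration only.
package example

import (
	"testing"
	"time"
)

// TestNeverFinishes simulates a spec that waits forever for a VMI to start.
// Running "go test -timeout 30s ." against this file ends with
// "panic: test timed out after 30s" followed by a goroutine dump, and the
// test binary exits non-zero, which make propagates as a build error.
func TestNeverFinishes(t *testing.T) {
	time.Sleep(24 * time.Hour) // far longer than any reasonable -timeout
}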