+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release
+ [[ windows2016-release =~ openshift-.* ]]
+ [[ windows2016-release =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.11.0
+ KUBEVIRT_PROVIDER=k8s-1.11.0
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
Downloading .......
2018/08/02 16:17:58 Waiting for host: 192.168.66.101:22
2018/08/02 16:18:01 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 16:18:09 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 16:18:14 Connected to tcp://192.168.66.101:22
++ systemctl status docker
++ grep active
++ wc -l
+ [[ 1 -eq 0 ]]
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] using Kubernetes version: v1.11.0
[preflight] running pre-flight checks
I0802 16:18:15.449464 1254 feature_gate.go:230] feature gates: &{map[]}
I0802 16:18:15.550014 1254 kernel_validator.go:81] Validating kernel version
I0802 16:18:15.550418 1254 kernel_validator.go:96] Validating kernel config
[preflight/images] Pulling images required for setting up a Kubernetes cluster
[preflight/images] This might take a minute or two, depending on the speed of your internet connection
[preflight/images] You can also perform this action in beforehand using 'kubeadm config images pull'
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[preflight] Activating the kubelet service
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [node01 localhost] and IPs [127.0.0.1 ::1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01 localhost] and IPs [192.168.66.101 127.0.0.1 ::1]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests"
[init] this might take a minute or longer if the control plane images have to be pulled
[apiclient] All control plane components are healthy after 53.007227 seconds
[uploadconfig] storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[kubelet] Creating a ConfigMap "kubelet-config-1.11" in namespace kube-system with the configuration for the kubelets in the cluster
[markmaster] Marking the node node01 as master by adding the label "node-role.kubernetes.io/master=''"
[markmaster] Marking the node node01 as master by adding the taints [node-role.kubernetes.io/master:NoSchedule]
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node01" as an annotation
[bootstraptoken] using token: abcdef.1234567890123456
[bootstraptoken] configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: CoreDNS
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at: https://kubernetes.io/docs/concepts/cluster-administration/addons/ You can now join any number of machines by running the following on each node as root: kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:f6c52e6634f6407ab7ff6d42934732c0fd74eae3699804300bbf0c513dcb51bd + kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml clusterrole.rbac.authorization.k8s.io/flannel created clusterrolebinding.rbac.authorization.k8s.io/flannel created serviceaccount/flannel created configmap/kube-flannel-cfg created daemonset.extensions/kube-flannel-ds created + kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule- node/node01 untainted + kubectl --kubeconfig=/etc/kubernetes/admin.conf create -f /tmp/local-volume.yaml storageclass.storage.k8s.io/local created configmap/local-storage-config created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-pv-binding created clusterrole.rbac.authorization.k8s.io/local-storage-provisioner-node-clusterrole created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-node-binding created role.rbac.authorization.k8s.io/local-storage-provisioner-jobs-role created rolebinding.rbac.authorization.k8s.io/local-storage-provisioner-jobs-rolebinding created serviceaccount/local-storage-admin created daemonset.extensions/local-volume-provisioner created 2018/08/02 16:19:25 Waiting for host: 192.168.66.102:22 2018/08/02 16:19:28 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 16:19:36 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 16:19:41 Connected to tcp://192.168.66.102:22 ++ systemctl status docker ++ grep active ++ wc -l + [[ 0 -eq 0 ]] + sleep 2 ++ systemctl status docker ++ grep active ++ wc -l + [[ 1 -eq 0 ]] + kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true [preflight] running pre-flight checks [WARNING RequiredIPVSKernelModulesAvailable]: the IPVS proxier will not be used, because the following required kernel modules are not loaded: [ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh] or no builtin kernel ipvs support: map[ip_vs_rr:{} ip_vs_wrr:{} ip_vs_sh:{} nf_conntrack_ipv4:{} ip_vs:{}] you can solve this problem with following methods: 1. Run 'modprobe -- ' to load missing kernel modules; 2. 
Provide the missing builtin kernel ipvs support
I0802 16:19:44.003006 1264 kernel_validator.go:81] Validating kernel version
I0802 16:19:44.003392 1264 kernel_validator.go:96] Validating kernel config
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"
[kubelet] Downloading configuration for the kubelet from the "kubelet-config-1.11" ConfigMap in the kube-system namespace
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[preflight] Activating the kubelet service
[tlsbootstrap] Waiting for the kubelet to perform the TLS Bootstrap...
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node02" as an annotation

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 38739968 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01    Ready     master    53s       v1.11.0
node02    Ready     <none>    22s       v1.11.0
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    54s       v1.11.0
node02    Ready     <none>    23s       v1.11.0
+ make cluster-sync
./cluster/build.sh
Building ...
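The bring-up that just completed follows a simple harness: register a cleanup trap before anything else, start the cluster, then gate on every node reporting Ready. A minimal sketch of that pattern, assuming the same Makefile targets and cluster/kubectl.sh wrapper seen in the trace; note that SIGSTOP, which the job's trap also names, cannot be caught by any process, so in practice only EXIT, SIGINT and SIGTERM matter:

  #!/bin/bash
  set -e
  # Guarantee teardown on any exit path; SIGSTOP is uncatchable, so it is
  # omitted here even though the job's own trap lists it.
  trap '{ make cluster-down; }' EXIT SIGINT SIGTERM
  make cluster-up
  # Node gate, as in the trace above: kubectl itself must succeed ...
  set +e
  cluster/kubectl.sh get nodes --no-headers
  kubectl_rc=$?
  set -e
  if [ "$kubectl_rc" -ne 0 ]; then
      echo 'kubectl failed to reach the cluster' >&2
      exit "$kubectl_rc"
  fi
  # ... and no node may still report NotReady.
  if [ -n "$(cluster/kubectl.sh get nodes --no-headers | grep NotReady)" ]; then
      echo 'Some nodes are not ready' >&2
      exit 1
  fi
  echo 'Nodes are ready:'
  cluster/kubectl.sh get nodes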
Untagged: localhost:33211/kubevirt/virt-controller:devel Untagged: localhost:33211/kubevirt/virt-controller@sha256:a01abe5aea99049dbb2bdc15cbb884aba70b6ff4419401bf26562a9e5c2d0bb3 Untagged: localhost:33195/kubevirt/virt-launcher:devel Untagged: localhost:33195/kubevirt/virt-launcher@sha256:e0391f2d37069e6a63f169addff583e119e1c4d78b2bf419a096b1b2ecaaf756 Untagged: localhost:33195/kubevirt/virt-handler:devel Untagged: localhost:33195/kubevirt/virt-handler@sha256:348a89ffd825cde3aca08da251bc9f7754d02b023ea4d591fad48383742c25ea Untagged: localhost:33195/kubevirt/virt-api:devel Untagged: localhost:33195/kubevirt/virt-api@sha256:71716e9f1387c612ad8ca861c579617aa69884dc0791b111e5eb50938b8fcb1e Untagged: localhost:33211/kubevirt/subresource-access-test:devel Untagged: localhost:33211/kubevirt/subresource-access-test@sha256:24b41c0186dbb5c297086530fb75db00e1ba2cb7acf98c21fb6c6828555f3e7b Untagged: localhost:33195/kubevirt/example-hook-sidecar:devel Untagged: localhost:33195/kubevirt/example-hook-sidecar@sha256:df65ae765fcda68a6eb08664b8e3872e4515ba44931371ec44e8d0c8aba7aa4a sha256:b69a3f94b2043cd36cc41eb5d9446480e0a640962e468ab72c3cc51f2b89386a go version go1.10 linux/amd64 go version go1.10 linux/amd64 make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt' hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh sha256:b69a3f94b2043cd36cc41eb5d9446480e0a640962e468ab72c3cc51f2b89386a go version go1.10 linux/amd64 go version go1.10 linux/amd64 find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory Compiling tests... compiled tests.test hack/build-docker.sh build Sending build context to Docker daemon 40.39 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller ---> Using cache ---> 84570f0bf244 Step 4/8 : WORKDIR /home/virt-controller ---> Using cache ---> 4b8efcbf3461 Step 5/8 : USER 1001 ---> Using cache ---> c49257f2ff48 Step 6/8 : COPY virt-controller /usr/bin/virt-controller ---> 54913d66352c Removing intermediate container 95fe86388e58 Step 7/8 : ENTRYPOINT /usr/bin/virt-controller ---> Running in 19fdc78ec5cd ---> e51b5a290844 Removing intermediate container 19fdc78ec5cd Step 8/8 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "virt-controller" '' ---> Running in 2c419b766e64 ---> b5eea128b38f Removing intermediate container 2c419b766e64 Successfully built b5eea128b38f Sending build context to Docker daemon 43.31 MB Step 1/10 : FROM kubevirt/libvirt:4.2.0 ---> 5f0bfe81a3e0 Step 2/10 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> c1e65e6c8241 Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107 ---> Using cache ---> 4c20d196c128 Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher ---> e50f53b0dad0 Removing intermediate container 0b3036a43683 Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt ---> 6ee128d32396 Removing intermediate container 4efed358e2c3 Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64 ---> Running in 12dba8da8364  ---> 317d4af5ed5f Removing intermediate container 12dba8da8364 Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher ---> Running in d4e2b1028b90  ---> 10588cc35489 
Removing intermediate container d4e2b1028b90 Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/ ---> cee7477896bb Removing intermediate container 7c580995431d Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh ---> Running in 14c3414e43fe ---> ede529188af4 Removing intermediate container 14c3414e43fe Step 10/10 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "virt-launcher" '' ---> Running in 8b48f167b18f ---> 8a237fc7820d Removing intermediate container 8b48f167b18f Successfully built 8a237fc7820d Sending build context to Docker daemon 38.4 MB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/5 : COPY virt-handler /usr/bin/virt-handler ---> 35d02b829a4a Removing intermediate container 279bb2f28613 Step 4/5 : ENTRYPOINT /usr/bin/virt-handler ---> Running in db0ca365e416 ---> a883ea70ee23 Removing intermediate container db0ca365e416 Step 5/5 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "virt-handler" '' ---> Running in 9ebc7c5bd09a ---> 436f3dfe6622 Removing intermediate container 9ebc7c5bd09a Successfully built 436f3dfe6622 Sending build context to Docker daemon 38.81 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api ---> Using cache ---> 6f2134b876af Step 4/8 : WORKDIR /home/virt-api ---> Using cache ---> d5ef0239bf68 Step 5/8 : USER 1001 ---> Using cache ---> 233000b2d9b5 Step 6/8 : COPY virt-api /usr/bin/virt-api ---> 5eaffc07cdaf Removing intermediate container c7e37eb8373d Step 7/8 : ENTRYPOINT /usr/bin/virt-api ---> Running in cae3e563244e ---> e1fca7561f11 Removing intermediate container cae3e563244e Step 8/8 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "virt-api" '' ---> Running in 6892870d23a0 ---> 7f1eba1ca7d3 Removing intermediate container 6892870d23a0 Successfully built 7f1eba1ca7d3 Sending build context to Docker daemon 4.096 kB Step 1/7 : FROM fedora:28 ---> cc510acfcd70 Step 2/7 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/7 : ENV container docker ---> Using cache ---> 3fe7db912524 Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img ---> Using cache ---> 06d762a67408 Step 5/7 : ADD entrypoint.sh / ---> Using cache ---> 3876d185cf84 Step 6/7 : CMD /entrypoint.sh ---> Using cache ---> 1fb50ce9b78f Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-windows2016-release1" '' ---> Using cache ---> ad0640d6a94a Successfully built ad0640d6a94a Sending build context to Docker daemon 2.56 kB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/5 : ENV container docker ---> Using cache ---> 3fe7db912524 Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all ---> Using cache ---> 6bc4f549313f Step 5/5 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "vm-killer" '' ---> Using cache ---> d1936042d584 Successfully built d1936042d584 Sending build context to Docker daemon 5.12 kB Step 1/7 : FROM debian:sid ---> 68f33cf86aab Step 2/7 : MAINTAINER "David Vossel" \ ---> Using cache ---> 9ef1c0ce5d24 Step 3/7 : ENV container docker ---> Using cache ---> 
9ad55e41ed61 Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/* ---> Using cache ---> 17a81fda7c2b Step 5/7 : ADD entry-point.sh / ---> Using cache ---> 681d01e165e6 Step 6/7 : CMD /entry-point.sh ---> Using cache ---> a79815fe82d9 Step 7/7 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "registry-disk-v1alpha" '' ---> Using cache ---> 6ef2fe0ba069 Successfully built 6ef2fe0ba069 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33290/kubevirt/registry-disk-v1alpha:devel ---> 6ef2fe0ba069 Step 2/4 : MAINTAINER "David Vossel" \ ---> Using cache ---> 01615351ca4e Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img ---> Using cache ---> 81ca76c46679 Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-windows2016-release1" '' ---> Using cache ---> c448af5e3322 Successfully built c448af5e3322 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33290/kubevirt/registry-disk-v1alpha:devel ---> 6ef2fe0ba069 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d330eefdd757 Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2 ---> Using cache ---> d4f7cb7b1be2 Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-windows2016-release1" '' ---> Using cache ---> c74218398637 Successfully built c74218398637 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33290/kubevirt/registry-disk-v1alpha:devel ---> 6ef2fe0ba069 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d330eefdd757 Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso ---> Using cache ---> 3696cd7aa2d3 Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-windows2016-release1" '' ---> Using cache ---> c5b23ac9de78 Successfully built c5b23ac9de78 Sending build context to Docker daemon 35.59 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl ---> Using cache ---> deebe9dc06da Step 4/8 : WORKDIR /home/virtctl ---> Using cache ---> 4094ce77e412 Step 5/8 : USER 1001 ---> Using cache ---> ba694520e9a4 Step 6/8 : COPY subresource-access-test /subresource-access-test ---> b70a8d38b0cc Removing intermediate container 00d205f76d55 Step 7/8 : ENTRYPOINT /subresource-access-test ---> Running in 78420cf38f71 ---> 01f38b5e15da Removing intermediate container 78420cf38f71 Step 8/8 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "subresource-access-test" '' ---> Running in 4ffb9d1bfeb4 ---> 587ce1128784 Removing intermediate container 4ffb9d1bfeb4 Successfully built 587ce1128784 Sending build context to Docker daemon 3.072 kB Step 1/9 : FROM fedora:28 ---> cc510acfcd70 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 3265a3c6f899 Step 3/9 : ENV container docker ---> Using cache ---> 3fe7db912524 Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all ---> Using cache ---> e0cf52293e57 Step 5/9 : ENV GIMME_GO_VERSION 1.9.2 ---> Using cache ---> 8c031086e8cb Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> 
/etc/profile.d/gimme.sh ---> Using cache ---> 0f6dd31de4d3 Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin" ---> Using cache ---> 6a702eb79a95 Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli ---> Using cache ---> bed79012c9f3 Step 9/9 : LABEL "kubevirt-functional-tests-windows2016-release1" '' "winrmcli" '' ---> Using cache ---> dd5c5c7f0ce2 Successfully built dd5c5c7f0ce2 Sending build context to Docker daemon 36.8 MB Step 1/5 : FROM fedora:27 ---> 9110ae7f579f Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> cc296a71da13 Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar ---> e709a2872b27 Removing intermediate container 519f0176d93c Step 4/5 : ENTRYPOINT /example-hook-sidecar ---> Running in 5b2d953d6f7b ---> 11390bb816a7 Removing intermediate container 5b2d953d6f7b Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-windows2016-release1" '' ---> Running in 5c99e647bf19 ---> 33e106e0ab20 Removing intermediate container 5c99e647bf19 Successfully built 33e106e0ab20 hack/build-docker.sh push The push refers to a repository [localhost:33290/kubevirt/virt-controller] 47cbab61c19c: Preparing 915a0c3e3f5f: Preparing 891e1e4ef82a: Preparing 915a0c3e3f5f: Pushed 47cbab61c19c: Pushed 891e1e4ef82a: Pushed devel: digest: sha256:4237c2f27b61ebcb46cd719ef10886490c716caaa019eef7c61bf16c154210d6 size: 949 The push refers to a repository [localhost:33290/kubevirt/virt-launcher] 6935fd5626aa: Preparing 5476406f0e10: Preparing 79f699dd2c43: Preparing 673c2091a64b: Preparing 92105bb835ac: Preparing 5379fb5d8cce: Preparing da38cf808aa5: Preparing b83399358a92: Preparing 186d8b3e4fd8: Preparing fa6154170bf5: Preparing 5eefb9960a36: Preparing 891e1e4ef82a: Preparing fa6154170bf5: Waiting 5eefb9960a36: Waiting 5379fb5d8cce: Waiting 891e1e4ef82a: Waiting 186d8b3e4fd8: Waiting da38cf808aa5: Waiting b83399358a92: Waiting 673c2091a64b: Pushed 6935fd5626aa: Pushed 5476406f0e10: Pushed da38cf808aa5: Pushed b83399358a92: Pushed fa6154170bf5: Pushed 186d8b3e4fd8: Pushed 891e1e4ef82a: Mounted from kubevirt/virt-controller 79f699dd2c43: Pushed 5379fb5d8cce: Pushed 92105bb835ac: Pushed 5eefb9960a36: Pushed devel: digest: sha256:21624f23558776cdfcfbecdf34374f1860052ac9ab0e28c7388276b53e624a67 size: 2828 The push refers to a repository [localhost:33290/kubevirt/virt-handler] d51e4b064b6f: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-launcher d51e4b064b6f: Pushed devel: digest: sha256:1482b8bb3390c5be9809455977dbc09606bec67c043fa139e80819d7d1038e53 size: 740 The push refers to a repository [localhost:33290/kubevirt/virt-api] 576518f43b03: Preparing 7cc07c574d2a: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-handler 7cc07c574d2a: Pushed 576518f43b03: Pushed devel: digest: sha256:732d97c18e313ed58f95e7e55746debe054c53c05c5b090548273f78149ba4d7 size: 948 The push refers to a repository [localhost:33290/kubevirt/disks-images-provider] 1548fa7b1c9e: Preparing a7621d2cf364: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-api 1548fa7b1c9e: Pushed a7621d2cf364: Pushed devel: digest: sha256:d273f6da472de0e04913d3468b8efabc603cdb07dec7d2ff3559414c226fceef size: 948 The push refers to a repository [localhost:33290/kubevirt/vm-killer] 3c31f9f8d755: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider 3c31f9f8d755: Pushed devel: digest: sha256:a6dc30b5b25246ac485e30d2aaee9c098c1e68191ce390cce3f0d8d4e1ad9328 size: 740 The 
push refers to a repository [localhost:33290/kubevirt/registry-disk-v1alpha] c66b9a220e25: Preparing 4662bbc21c2d: Preparing 25edbec0eaea: Preparing c66b9a220e25: Pushed 4662bbc21c2d: Pushed 25edbec0eaea: Pushed devel: digest: sha256:983fa47e2a9f84477bd28f2f1c36f24812001a833dca5b4ae9a4d436a2d2564c size: 948 The push refers to a repository [localhost:33290/kubevirt/cirros-registry-disk-demo] 8081bd2f2d51: Preparing c66b9a220e25: Preparing 4662bbc21c2d: Preparing 25edbec0eaea: Preparing c66b9a220e25: Mounted from kubevirt/registry-disk-v1alpha 4662bbc21c2d: Mounted from kubevirt/registry-disk-v1alpha 25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha 8081bd2f2d51: Pushed devel: digest: sha256:90cff06e4e356cc860429e715d7eb65570de321773a692851fd7888f39a0e2b0 size: 1160 The push refers to a repository [localhost:33290/kubevirt/fedora-cloud-registry-disk-demo] fa1881d7bf95: Preparing c66b9a220e25: Preparing 4662bbc21c2d: Preparing 25edbec0eaea: Preparing c66b9a220e25: Mounted from kubevirt/cirros-registry-disk-demo 4662bbc21c2d: Mounted from kubevirt/cirros-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo fa1881d7bf95: Pushed devel: digest: sha256:18c2e2f569079fd2da55a2eb87240fe29015c8fbf293d125557e82dfb55a4cf0 size: 1161 The push refers to a repository [localhost:33290/kubevirt/alpine-registry-disk-demo] d01c36937189: Preparing c66b9a220e25: Preparing 4662bbc21c2d: Preparing 25edbec0eaea: Preparing c66b9a220e25: Mounted from kubevirt/fedora-cloud-registry-disk-demo 4662bbc21c2d: Mounted from kubevirt/fedora-cloud-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo d01c36937189: Pushed devel: digest: sha256:994d447b46abde194e1d6610f761887b06c5a3b57c80e1807cdc6138f0d20f15 size: 1160 The push refers to a repository [localhost:33290/kubevirt/subresource-access-test] 364ffbe9dd36: Preparing 7e69243e781e: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/vm-killer 7e69243e781e: Pushed 364ffbe9dd36: Pushed devel: digest: sha256:c263c0c25f0405db9d7c228290ae28c92368bae07242bebcae540087f2a52843 size: 948 The push refers to a repository [localhost:33290/kubevirt/winrmcli] a117c61a5658: Preparing c9df4405017d: Preparing 99bb32247f65: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/subresource-access-test a117c61a5658: Pushed 99bb32247f65: Pushed c9df4405017d: Pushed devel: digest: sha256:8f5e5fefe668fada12ea95ecdc2c5e3a1055bf6b924ed23056d8ab448a09b6f8 size: 1165 The push refers to a repository [localhost:33290/kubevirt/example-hook-sidecar] 4941c14c4c12: Preparing 39bae602f753: Preparing 4941c14c4c12: Pushed 39bae602f753: Pushed devel: digest: sha256:2bb5f8313ccfa4e251c1209a198cc633b45d98288b24e44ef7879d1bafa25535 size: 740 make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt' Done ./cluster/clean.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ 
TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-windows2016-release ']' ++ provider_prefix=kubevirt-functional-tests-windows2016-release1 ++ job_prefix=kubevirt-functional-tests-windows2016-release1 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-182-g4c4116d ++ KUBEVIRT_VERSION=v0.7.0-182-g4c4116d + source cluster/k8s-1.11.0/provider.sh ++ set -e ++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.11.0.sh ++ source hack/config-provider-k8s-1.11.0.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl +++ docker_prefix=localhost:33290/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Cleaning up ...' Cleaning up ... 
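The "Cleaning up ..." phase that follows is mechanical: for each target namespace, every KubeVirt-labeled resource type is deleted through _kubectl, a thin wrapper that pins KUBECONFIG to the provider's kubeconfig. A condensed sketch of what ./cluster/clean.sh is doing, with the wrapper behavior and the resource-type list taken from the trace below:

  _kubectl() {
      # Pin kubectl to the ephemeral provider's kubeconfig, as the trace shows.
      export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
      cluster/k8s-1.11.0/.kubectl "$@"
  }

  namespaces=(default kube-system)
  types='apiservices deployment rs services validatingwebhookconfiguration
         secrets pv pvc ds customresourcedefinitions pods clusterrolebinding
         rolebinding roles clusterroles serviceaccounts'
  for ns in "${namespaces[@]}"; do
      for t in $types; do
          _kubectl -n "$ns" delete "$t" -l kubevirt.io
      done
  done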
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n default delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete secrets -l kubevirt.io No resources found + _kubectl -n default delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pv -l kubevirt.io No resources found + _kubectl -n default delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pvc -l kubevirt.io No resources found + _kubectl -n default delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete ds -l kubevirt.io No resources found + _kubectl -n default delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n default delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pods -l kubevirt.io No resources found + 
_kubectl -n default delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n default delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rolebinding -l kubevirt.io No resources found + _kubectl -n default delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete roles -l kubevirt.io No resources found + _kubectl -n default delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterroles -l kubevirt.io No resources found + _kubectl -n default delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io No resources found. Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + for i in '${namespaces[@]}' + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete deployment -l kubevirt.io No resources found + _kubectl -n kube-system delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rs -l kubevirt.io No resources found + _kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export 
KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io No resources found. 
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done ./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-windows2016-release ']' ++ provider_prefix=kubevirt-functional-tests-windows2016-release1 ++ job_prefix=kubevirt-functional-tests-windows2016-release1 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-182-g4c4116d ++ KUBEVIRT_VERSION=v0.7.0-182-g4c4116d + source cluster/k8s-1.11.0/provider.sh ++ set -e ++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.11.0.sh ++ source hack/config-provider-k8s-1.11.0.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig +++ 
kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl +++ docker_prefix=localhost:33290/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... + [[ -z windows2016-release ]] + [[ windows2016-release =~ .*-dev ]] + [[ windows2016-release =~ .*-release ]] + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]] + continue + for manifest in '${MANIFESTS_OUT_DIR}/release/*' + [[ /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]] + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml clusterrole.rbac.authorization.k8s.io/kubevirt.io:admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:edit created clusterrole.rbac.authorization.k8s.io/kubevirt.io:view created serviceaccount/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver-auth-delegator created rolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created role.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-controller created serviceaccount/kubevirt-controller created serviceaccount/kubevirt-privileged created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller-cluster-admin created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-privileged-cluster-admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:default created clusterrolebinding.rbac.authorization.k8s.io/kubevirt.io:default created service/virt-api created deployment.extensions/virt-api created deployment.extensions/virt-controller created daemonset.extensions/virt-handler created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstances.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancereplicasets.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancepresets.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachines.kubevirt.io created + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R persistentvolumeclaim/disk-alpine created persistentvolume/host-path-disk-alpine created 
persistentvolumeclaim/disk-custom created
persistentvolume/host-path-disk-custom created
daemonset.extensions/disks-images-provider created
serviceaccount/kubevirt-testing created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-testing-cluster-admin created
+ [[ k8s-1.11.0 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n 'virt-api-bcc6b587d-vm5c8 0/1 ContainerCreating 0 2s
virt-api-bcc6b587d-wc67b 0/1 ContainerCreating 0 3s
virt-controller-67dcdd8464-8bbvq 0/1 ContainerCreating 0 3s
virt-controller-67dcdd8464-grg2b 0/1 ContainerCreating 0 2s
virt-handler-7tpdp 0/1 ContainerCreating 0 2s
virt-handler-mr9b9 0/1 ContainerCreating 0 3s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ cluster/kubectl.sh get pods -n kube-system --no-headers
+ grep -v Running
disks-images-provider-4dk74        0/1   Pending             0   1s
disks-images-provider-h89ml        0/1   Pending             0   1s
virt-api-bcc6b587d-vm5c8           0/1   ContainerCreating   0   3s
virt-api-bcc6b587d-wc67b           0/1   ContainerCreating   0   4s
virt-controller-67dcdd8464-8bbvq   0/1   ContainerCreating   0   4s
virt-controller-67dcdd8464-grg2b   0/1   ContainerCreating   0   3s
virt-handler-7tpdp                 0/1   ContainerCreating   0   3s
virt-handler-mr9b9                 0/1   ContainerCreating   0   4s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n false ']'
+ echo 'Waiting for KubeVirt containers to become ready ...'
Waiting for KubeVirt containers to become ready ...
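What is running at this point is a two-stage poll with a 300-second budget sampled every 30 seconds, the timeout/sample values set above: first wait until no pod in the namespace reports a non-Running status, then wait until no container reports ready=false. A sketch of one namespace's loop, reconstructed from the trace:

  timeout=300
  sample=30
  current_time=0
  # Stage 1: no pod may be in a non-Running phase.
  while [ -n "$(cluster/kubectl.sh get pods -n kube-system --no-headers | grep -v Running)" ]; do
      echo 'Waiting for kubevirt pods to enter the Running state ...'
      sleep "$sample"
      current_time=$((current_time + sample))
      if [ "$current_time" -gt "$timeout" ]; then exit 1; fi
  done
  # Stage 2: every container in every pod must report ready=true.
  current_time=0
  while [ -n "$(cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers | grep false)" ]; do
      echo 'Waiting for KubeVirt containers to become ready ...'
      sleep "$sample"
      current_time=$((current_time + sample))
      if [ "$current_time" -gt "$timeout" ]; then exit 1; fi
  done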
+ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ grep false
+ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
false
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                               READY   STATUS    RESTARTS   AGE
coredns-78fcdf6894-c7wdv           1/1     Running   0          15m
coredns-78fcdf6894-x84sr           1/1     Running   0          15m
disks-images-provider-4dk74        1/1     Running   0          1m
disks-images-provider-h89ml        1/1     Running   0          1m
etcd-node01                        1/1     Running   0          15m
kube-apiserver-node01              1/1     Running   0          15m
kube-controller-manager-node01     1/1     Running   0          14m
kube-flannel-ds-72kmp              1/1     Running   0          15m
kube-flannel-ds-ghjg8              1/1     Running   0          15m
kube-proxy-bqrsx                   1/1     Running   0          15m
kube-proxy-vhr87                   1/1     Running   0          15m
kube-scheduler-node01              1/1     Running   0          15m
virt-api-bcc6b587d-vm5c8           1/1     Running   0          1m
virt-api-bcc6b587d-wc67b           1/1     Running   0          1m
virt-controller-67dcdd8464-8bbvq   1/1     Running   0          1m
virt-controller-67dcdd8464-grg2b   1/1     Running   0          1m
virt-handler-7tpdp                 1/1     Running   0          1m
virt-handler-mr9b9                 1/1     Running   0          1m
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
NAME                             READY   STATUS    RESTARTS   AGE
local-volume-provisioner-752hr   1/1     Running   0          15m
local-volume-provisioner-v4qmb   1/1     Running   0          15m
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:17:28Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:08:34Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/junit.xml'
+ [[ windows2016-release =~ windows.* ]]
+ [[ -d /home/nfs/images/windows2016 ]]
+ kubectl create -f -
+ cluster/kubectl.sh create -f -
persistentvolume/disk-windows created
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/junit.xml --ginkgo.focus=Windows'
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-windows2016-release/junit.xml --ginkgo.focus=Windows'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:b69a3f94b2043cd36cc41eb5d9446480e0a640962e468ab72c3cc51f2b89386a
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
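The Windows focus just applied is opt-in: it happens only because the job target matches windows.* and the NFS image directory exists, after which a PV named disk-windows is created (the manifest is piped via 'create -f -' and never echoed to the log) and ginkgo is narrowed to the Windows specs. A sketch of that gating; TARGET and WINDOWS_PV_MANIFEST are hypothetical names standing in for values the script holds internally:

  TARGET=windows2016-release   # hypothetical: the job's target string
  ginko_params="--ginkgo.noColor --junit-output=${WORKSPACE}/junit.xml"
  if [[ $TARGET =~ windows.* ]] && [[ -d $NFS_WINDOWS_DIR ]]; then
      # The actual PV manifest is not captured in the log.
      cluster/kubectl.sh create -f "$WINDOWS_PV_MANIFEST"   # hypothetical path
      ginko_params="$ginko_params --ginkgo.focus=Windows"
  fi
  FUNC_TEST_ARGS=$ginko_params make functest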
compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1533227749 Will run 6 of 148 specs SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:34:36 http: TLS handshake error from 10.244.0.1:50266: EOF 2018/08/02 16:34:46 http: TLS handshake error from 10.244.0.1:50328: EOF 2018/08/02 16:34:56 http: TLS handshake error from 10.244.0.1:50430: EOF 2018/08/02 16:35:06 http: TLS handshake error from 10.244.0.1:50490: EOF 2018/08/02 16:35:16 http: TLS handshake error from 10.244.0.1:50570: EOF 2018/08/02 16:35:26 http: TLS handshake error from 10.244.0.1:50640: EOF 2018/08/02 16:35:36 http: TLS handshake error from 10.244.0.1:50726: EOF 2018/08/02 16:35:46 http: TLS handshake error from 10.244.0.1:50786: EOF 2018/08/02 16:35:56 http: TLS handshake error from 10.244.0.1:50848: EOF 2018/08/02 16:36:06 http: TLS handshake error from 10.244.0.1:50908: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running 2018/08/02 16:35:17 http: TLS handshake error from 10.244.1.1:50664: EOF level=info timestamp=2018-08-02T16:35:18.959235Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:35:26.213825Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:35:27 http: TLS handshake error from 10.244.1.1:50692: EOF 2018/08/02 16:35:37 http: TLS handshake error from 10.244.1.1:50712: EOF level=info timestamp=2018-08-02T16:35:41.780678Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:35:42.388092Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:35:45.430539Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:35:45.446172Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:35:46.755352Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:35:47 http: TLS handshake error from 10.244.1.1:50718: EOF level=info timestamp=2018-08-02T16:35:49.015826Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:35:56.277199Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 
16:35:57 http: TLS handshake error from 10.244.1.1:50724: EOF 2018/08/02 16:36:07 http: TLS handshake error from 10.244.1.1:50742: EOF Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031831Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T16:34:11.031887Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T16:34:11.031910Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T16:34:11.031927Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:36:08.336881Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:36:08.340922Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T16:36:08.341166Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2hqgc" level=info timestamp=2018-08-02T16:36:08.421440Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi2hqgc, existing: true\n" level=info timestamp=2018-08-02T16:36:08.421525Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:36:08.421553Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:36:08.421797Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:36:08.424471Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:36:08.424663Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2hqgc" level=info timestamp=2018-08-02T16:36:08.585150Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi2hqgc, existing: true\n" level=info timestamp=2018-08-02T16:36:08.585226Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:36:08.585254Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:36:08.585333Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:36:08.592408Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:36:08.592832Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2hqgc" Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmi2hqgc-886tm Pod phase: Running level=info timestamp=2018-08-02T16:36:08.100952Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer" level=error timestamp=2018-08-02T16:36:08.192380Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.192742Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.257746Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.257831Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.262261Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.262337Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.295542Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.295707Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.340123Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:36:08.340215Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.424153Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.424244Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:08.591376Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:08.591463Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:39:26 http: TLS handshake error from 10.244.0.1:52124: EOF 2018/08/02 16:39:36 http: TLS handshake error from 10.244.0.1:52184: EOF 2018/08/02 16:39:46 http: TLS handshake error from 10.244.0.1:52244: EOF 2018/08/02 16:39:56 http: TLS handshake error from 10.244.0.1:52304: EOF 2018/08/02 16:40:06 http: TLS handshake error from 10.244.0.1:52364: EOF 2018/08/02 16:40:16 http: TLS handshake error from 10.244.0.1:52424: EOF 2018/08/02 16:40:26 http: TLS handshake error from 10.244.0.1:52484: EOF 2018/08/02 16:40:36 http: TLS handshake error from 10.244.0.1:52544: EOF 2018/08/02 16:40:46 http: TLS handshake error from 10.244.0.1:52604: EOF 2018/08/02 16:40:56 http: TLS handshake error from 10.244.0.1:52664: EOF 2018/08/02 16:41:06 http: TLS handshake error from 10.244.0.1:52724: EOF 2018/08/02 16:41:16 http: TLS handshake error from 10.244.0.1:52784: EOF 2018/08/02 16:41:26 http: TLS handshake error from 10.244.0.1:52844: EOF 2018/08/02 16:41:36 http: TLS handshake error from 10.244.0.1:52904: EOF 2018/08/02 16:41:46 http: TLS handshake error from 10.244.0.1:52964: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:40:57.786067Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:07 http: TLS handshake error from 10.244.1.1:50922: EOF level=info timestamp=2018-08-02T16:41:12.609670Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info 
timestamp=2018-08-02T16:41:13.128096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:16.635732Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:41:17 http: TLS handshake error from 10.244.1.1:50928: EOF level=info timestamp=2018-08-02T16:41:20.256876Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:27 http: TLS handshake error from 10.244.1.1:50934: EOF level=info timestamp=2018-08-02T16:41:27.925759Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:37 http: TLS handshake error from 10.244.1.1:50940: EOF level=info timestamp=2018-08-02T16:41:42.670094Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:43.197801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:46.673259Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:41:47 http: TLS handshake error from 10.244.1.1:50946: EOF level=info timestamp=2018-08-02T16:41:50.364957Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031831Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-02T16:34:11.031887Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-02T16:34:11.031910Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T16:34:11.031927Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." 
level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:38:52.243784Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:38:52.243904Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:38:52.244384Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:38:52.261510Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:38:52.262195Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2hqgc" level=info timestamp=2018-08-02T16:41:36.103716Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi2hqgc, existing: true\n" level=info timestamp=2018-08-02T16:41:36.104263Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:41:36.104407Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:41:36.104780Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:41:36.155508Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T16:41:36.155819Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi2hqgc, existing: true\n" level=info timestamp=2018-08-02T16:41:36.155874Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T16:41:36.155949Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:41:36.156067Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:41:36.156176Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmi2hqgc-886tm Pod phase: Failed level=error timestamp=2018-08-02T16:36:28.843395Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:36:49.340688Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:36:49.340965Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:37:30.318401Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:37:30.319064Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:38:52.259470Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:38:52.260324Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4201f9040, 0xc420204380, 0xc421a942a0)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
------------------------------
• Failure [360.805 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  should succeed to start a vmi [It]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:133

  Unexpected Warning event received: testvmi2hqgc,1dba426c-9672-11e8-93fb-525500d15501: server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')
  Expected
      : Warning
  not to equal
      : Warning
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246
------------------------------
level=info timestamp=2018-08-02T16:35:51.197588Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi2hqgc kind=VirtualMachineInstance uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi2hqgc-886tm"
level=info timestamp=2018-08-02T16:36:08.291103Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi2hqgc kind=VirtualMachineInstance uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi2hqgc-886tm"
level=error timestamp=2018-08-02T16:36:08.540516Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmi2hqgc kind=VirtualMachineInstance uid=1dba426c-9672-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error.
command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:39:46 http: TLS handshake error from 10.244.0.1:52244: EOF 2018/08/02 16:39:56 http: TLS handshake error from 10.244.0.1:52304: EOF 2018/08/02 16:40:06 http: TLS handshake error from 10.244.0.1:52364: EOF 2018/08/02 16:40:16 http: TLS handshake error from 10.244.0.1:52424: EOF 2018/08/02 16:40:26 http: TLS handshake error from 10.244.0.1:52484: EOF 2018/08/02 16:40:36 http: TLS handshake error from 10.244.0.1:52544: EOF 2018/08/02 16:40:46 http: TLS handshake error from 10.244.0.1:52604: EOF 2018/08/02 16:40:56 http: TLS handshake error from 10.244.0.1:52664: EOF 2018/08/02 16:41:06 http: TLS handshake error from 10.244.0.1:52724: EOF 2018/08/02 16:41:16 http: TLS handshake error from 10.244.0.1:52784: EOF 2018/08/02 16:41:26 http: TLS handshake error from 10.244.0.1:52844: EOF 2018/08/02 16:41:36 http: TLS handshake error from 10.244.0.1:52904: EOF 2018/08/02 16:41:46 http: TLS handshake error from 10.244.0.1:52964: EOF 2018/08/02 16:41:56 http: TLS handshake error from 10.244.0.1:53024: EOF 2018/08/02 16:42:06 http: TLS handshake error from 10.244.0.1:53084: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:41:12.609670Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:13.128096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:16.635732Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:41:17 http: TLS handshake error from 10.244.1.1:50928: EOF level=info timestamp=2018-08-02T16:41:20.256876Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:27 http: TLS handshake error from 10.244.1.1:50934: EOF level=info timestamp=2018-08-02T16:41:27.925759Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:37 http: TLS handshake error from 10.244.1.1:50940: EOF level=info timestamp=2018-08-02T16:41:42.670094Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:43.197801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:41:46.673259Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 
contentLength=19 2018/08/02 16:41:47 http: TLS handshake error from 10.244.1.1:50946: EOF level=info timestamp=2018-08-02T16:41:50.364957Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:41:57 http: TLS handshake error from 10.244.1.1:50952: EOF level=info timestamp=2018-08-02T16:41:58.051047Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031910Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T16:34:11.031927Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:42:06.421954Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:42:06.426728Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:42:06.427229Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicc4mf" level=info timestamp=2018-08-02T16:42:06.468121Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmicc4mf, existing: true\n" level=info timestamp=2018-08-02T16:42:06.468268Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:42:06.468336Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:42:06.468535Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:42:06.476231Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:42:06.476571Z pos=vm.go:250 component=virt-handler reason="server error. 
command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicc4mf" level=info timestamp=2018-08-02T16:42:06.557191Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmicc4mf, existing: true\n" level=info timestamp=2018-08-02T16:42:06.557318Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:42:06.557385Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:42:06.557732Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:42:06.561968Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:42:06.562382Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicc4mf" Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmicc4mf-qfw8d Pod phase: Running level=info timestamp=2018-08-02T16:42:06.242368Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer" level=error timestamp=2018-08-02T16:42:06.321868Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:06.322284Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.394180Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:42:06.394346Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.399996Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:06.400165Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.426077Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:06.426232Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.475377Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:06.475537Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.561279Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:06.561430Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:06.727488Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:42:06.727700Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:45:26 http: TLS handshake error from 10.244.0.1:54284: EOF 2018/08/02 16:45:36 http: TLS handshake error from 10.244.0.1:54344: EOF 2018/08/02 16:45:46 http: TLS handshake error from 10.244.0.1:54404: EOF 2018/08/02 16:45:56 http: TLS handshake error from 10.244.0.1:54464: EOF 2018/08/02 16:46:06 http: TLS handshake error from 10.244.0.1:54524: EOF 2018/08/02 16:46:16 http: TLS handshake error from 10.244.0.1:54584: EOF 2018/08/02 16:46:26 http: TLS handshake error from 10.244.0.1:54644: EOF 2018/08/02 16:46:36 http: TLS handshake error from 10.244.0.1:54704: EOF 2018/08/02 16:46:46 http: TLS handshake error from 10.244.0.1:54764: EOF 2018/08/02 16:46:56 http: TLS handshake error from 10.244.0.1:54824: EOF 2018/08/02 16:47:06 http: TLS handshake error from 10.244.0.1:54884: EOF 2018/08/02 16:47:16 http: TLS handshake error from 10.244.0.1:54944: EOF 2018/08/02 16:47:26 http: TLS handshake error from 10.244.0.1:55004: EOF 2018/08/02 16:47:36 http: TLS handshake error from 10.244.0.1:55064: EOF 2018/08/02 16:47:46 http: TLS handshake error from 10.244.0.1:55124: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:47:13.471761Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:13.993879Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:16.658310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:17 http: TLS handshake error from 10.244.1.1:51144: EOF level=info timestamp=2018-08-02T16:47:21.322683Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:24.746949Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:47:24.750213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:27 http: TLS handshake error from 10.244.1.1:51150: EOF level=info timestamp=2018-08-02T16:47:29.697335Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:47:37 http: TLS handshake error from 10.244.1.1:51156: EOF level=info timestamp=2018-08-02T16:47:43.552743Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:44.063490Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:46.746335Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:47 http: TLS handshake error from 10.244.1.1:51162: EOF level=info timestamp=2018-08-02T16:47:51.364280Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031910Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-02T16:34:11.031927Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:44:50.321915Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:44:50.322038Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:44:50.322433Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:44:50.343249Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:44:50.344003Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicc4mf" level=info timestamp=2018-08-02T16:47:34.184871Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmicc4mf, existing: true\n" level=info timestamp=2018-08-02T16:47:34.185338Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:47:34.185414Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:47:34.185733Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:47:34.238632Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T16:47:34.238875Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmicc4mf, existing: true\n" level=info timestamp=2018-08-02T16:47:34.238966Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T16:47:34.239034Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:47:34.239180Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:47:34.239322Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmicc4mf-qfw8d Pod phase: Failed level=error timestamp=2018-08-02T16:42:26.931873Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:42:47.425182Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:42:47.425625Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:43:28.397953Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:43:28.398360Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:44:50.340258Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:44:50.341358Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc42036c0b0, 0xc4202041c0, 0xc4206b3440)
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
• Failure [360.841 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  should succeed to stop a running vmi [It]
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:139

  Unexpected Warning event received: testvmicc4mf,f4ec8e6b-9672-11e8-93fb-525500d15501: server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')
  Expected
      : Warning
  not to equal
      : Warning
  /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246
------------------------------
STEP: Starting the vmi
level=info timestamp=2018-08-02T16:41:51.963992Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmicc4mf kind=VirtualMachineInstance uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmicc4mf-qfw8d"
level=info timestamp=2018-08-02T16:42:06.461583Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmicc4mf kind=VirtualMachineInstance uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmicc4mf-qfw8d"
level=error timestamp=2018-08-02T16:42:06.628599Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmicc4mf kind=VirtualMachineInstance uid=f4ec8e6b-9672-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error.
command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:45:46 http: TLS handshake error from 10.244.0.1:54404: EOF 2018/08/02 16:45:56 http: TLS handshake error from 10.244.0.1:54464: EOF 2018/08/02 16:46:06 http: TLS handshake error from 10.244.0.1:54524: EOF 2018/08/02 16:46:16 http: TLS handshake error from 10.244.0.1:54584: EOF 2018/08/02 16:46:26 http: TLS handshake error from 10.244.0.1:54644: EOF 2018/08/02 16:46:36 http: TLS handshake error from 10.244.0.1:54704: EOF 2018/08/02 16:46:46 http: TLS handshake error from 10.244.0.1:54764: EOF 2018/08/02 16:46:56 http: TLS handshake error from 10.244.0.1:54824: EOF 2018/08/02 16:47:06 http: TLS handshake error from 10.244.0.1:54884: EOF 2018/08/02 16:47:16 http: TLS handshake error from 10.244.0.1:54944: EOF 2018/08/02 16:47:26 http: TLS handshake error from 10.244.0.1:55004: EOF 2018/08/02 16:47:36 http: TLS handshake error from 10.244.0.1:55064: EOF 2018/08/02 16:47:46 http: TLS handshake error from 10.244.0.1:55124: EOF 2018/08/02 16:47:56 http: TLS handshake error from 10.244.0.1:55184: EOF 2018/08/02 16:48:06 http: TLS handshake error from 10.244.0.1:55244: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:47:16.658310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:17 http: TLS handshake error from 10.244.1.1:51144: EOF level=info timestamp=2018-08-02T16:47:21.322683Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:24.746949Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:47:24.750213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:27 http: TLS handshake error from 10.244.1.1:51150: EOF level=info timestamp=2018-08-02T16:47:29.697335Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:47:37 http: TLS handshake error from 10.244.1.1:51156: EOF level=info timestamp=2018-08-02T16:47:43.552743Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:44.063490Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:47:46.746335Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:47:47 http: TLS handshake error from 10.244.1.1:51162: EOF 
level=info timestamp=2018-08-02T16:47:51.364280Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:47:57 http: TLS handshake error from 10.244.1.1:51168: EOF level=info timestamp=2018-08-02T16:47:59.837128Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:48:07.313060Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:48:07.320980Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:48:07.321360Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid28dm" level=info timestamp=2018-08-02T16:48:07.361857Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmid28dm, existing: true\n" level=info timestamp=2018-08-02T16:48:07.362013Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:48:07.362081Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:48:07.362265Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:48:07.369663Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="server error. 
command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:48:07.370055Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid28dm" level=info timestamp=2018-08-02T16:48:07.450650Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmid28dm, existing: true\n" level=info timestamp=2018-08-02T16:48:07.450796Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:48:07.450866Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:48:07.451056Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:48:07.455742Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:48:07.456126Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid28dm" Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmid28dm-f7cgn Pod phase: Running level=info timestamp=2018-08-02T16:48:07.173340Z pos=converter.go:751 component=virt-launcher msg="Found nameservers in /etc/resolv.conf: \n`\u0000\n" level=info timestamp=2018-08-02T16:48:07.173498Z pos=converter.go:752 component=virt-launcher msg="Found search domains in /etc/resolv.conf: kubevirt-test-default.svc.cluster.local svc.cluster.local cluster.local" level=info timestamp=2018-08-02T16:48:07.175342Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer" level=error timestamp=2018-08-02T16:48:07.262173Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed." 
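The virError in the launcher log just above (Code=8 is libvirt's VIR_ERR_INVALID_ARG) means libvirt could not offer a kvm domain for x86_64 on this node: the CI VM has no working KVM, typically because /dev/kvm is missing or nested virtualization is disabled, so defining a <domain type='kvm'> fails before any disk or network setup matters. A minimal pre-flight sketch of that condition, not KubeVirt's actual check:

package main

import (
	"fmt"
	"os"
)

func main() {
	// If /dev/kvm is absent inside the launcher pod, libvirt cannot expose
	// domaintype=kvm and defining the domain fails exactly like the log above.
	if _, err := os.Stat("/dev/kvm"); err != nil {
		fmt.Println("KVM unavailable (nested virt off or device not passed through):", err)
		fmt.Println("only <domain type='qemu'> software emulation would be definable")
		return
	}
	fmt.Println("/dev/kvm present; <domain type='kvm'> should be definable")
}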
level=error timestamp=2018-08-02T16:48:07.262484Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:07.300304Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:48:07.300519Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:07.307018Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:48:07.307176Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:07.320288Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:48:07.320468Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:07.368981Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:48:07.369134Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:07.455002Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
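The interleaved "mkdir ...: file exists" failures look like a retry trap rather than an independent bug: the first Launcher.Sync attempt creates the ephemeral windows-disk directory and then fails on the KVM capability error, so every re-enqueued attempt dies even earlier, in directory creation. A sketch of the trap and the idempotent fix, using illustrative names rather than KubeVirt's real ephemeral-disk code:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// prepareDiskDir stands in for the pre-start step that creates the per-disk
// directory. os.Mkdir(dir, 0755) would fail on the second attempt with
// "file exists" (and needs the parent to exist already); os.MkdirAll is
// idempotent, so retries pass this step cleanly.
func prepareDiskDir(base, disk string) error {
	dir := filepath.Join(base, "disk-data", disk)
	return os.MkdirAll(dir, 0755)
}

func main() {
	base, _ := os.MkdirTemp("", "ephemeral-disk")
	defer os.RemoveAll(base)
	for attempt := 1; attempt <= 2; attempt++ {
		fmt.Printf("attempt %d: err=%v\n", attempt, prepareDiskDir(base, "windows-disk"))
	}
}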
level=error timestamp=2018-08-02T16:48:07.455208Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:51:26 http: TLS handshake error from 10.244.0.1:56444: EOF 2018/08/02 16:51:36 http: TLS handshake error from 10.244.0.1:56504: EOF 2018/08/02 16:51:46 http: TLS handshake error from 10.244.0.1:56564: EOF 2018/08/02 16:51:56 http: TLS handshake error from 10.244.0.1:56624: EOF 2018/08/02 16:52:06 http: TLS handshake error from 10.244.0.1:56684: EOF 2018/08/02 16:52:16 http: TLS handshake error from 10.244.0.1:56744: EOF 2018/08/02 16:52:26 http: TLS handshake error from 10.244.0.1:56804: EOF 2018/08/02 16:52:36 http: TLS handshake error from 10.244.0.1:56864: EOF 2018/08/02 16:52:46 http: TLS handshake error from 10.244.0.1:56924: EOF 2018/08/02 16:52:56 http: TLS handshake error from 10.244.0.1:56984: EOF 2018/08/02 16:53:06 http: TLS handshake error from 10.244.0.1:57044: EOF 2018/08/02 16:53:16 http: TLS handshake error from 10.244.0.1:57104: EOF 2018/08/02 16:53:26 http: TLS handshake error from 10.244.0.1:57164: EOF 2018/08/02 16:53:36 http: TLS handshake error from 10.244.0.1:57224: EOF 2018/08/02 16:53:46 http: TLS handshake error from 10.244.0.1:57284: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:53:01.347845Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:07 http: TLS handshake error from 10.244.1.1:51354: EOF level=info timestamp=2018-08-02T16:53:14.419540Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:14.849175Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:16.736015Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:53:17 http: TLS handshake error from 10.244.1.1:51360: EOF level=info timestamp=2018-08-02T16:53:22.257368Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:27 http: TLS handshake error from 10.244.1.1:51366: EOF level=info timestamp=2018-08-02T16:53:31.520959Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:37 http: TLS handshake error from 10.244.1.1:51372: EOF level=info timestamp=2018-08-02T16:53:44.485912Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:44.911182Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:46.788785Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:53:47 http: TLS handshake error from 10.244.1.1:51378: EOF level=info timestamp=2018-08-02T16:53:52.352099Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031944Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-02T16:34:11.031961Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:50:51.224420Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:50:51.224530Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:50:51.224957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:50:51.242333Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:50:51.242997Z pos=vm.go:250 component=virt-handler reason="server error. 
command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid28dm" level=info timestamp=2018-08-02T16:53:35.084443Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmid28dm, existing: true\n" level=info timestamp=2018-08-02T16:53:35.085080Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:53:35.085171Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:53:35.085551Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:53:35.114881Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T16:53:35.120314Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmid28dm, existing: true\n" level=info timestamp=2018-08-02T16:53:35.120445Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T16:53:35.120516Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:53:35.120899Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:53:35.121070Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmid28dm-f7cgn Pod phase: Failed level=error timestamp=2018-08-02T16:48:27.831050Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:48:48.327186Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
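After roughly five minutes of these failed Sync attempts the launcher gives up: the first argument in the panic trace below, 0x45d964b800, is 300,000,000,000 nanoseconds, i.e. a 5-minute deadline for the libvirt domain to appear. A sketch of that wait-with-deadline shape (illustrative names, not virt-launcher's real waitForDomainUUID):

package main

import (
	"fmt"
	"time"
)

// waitForDomainUUID blocks until the domain is reported as defined or the
// deadline passes; on timeout it panics, which is what ends the pod below
// with "virt-launcher exited with code 2".
func waitForDomainUUID(timeout time.Duration, defined <-chan string) string {
	select {
	case uuid := <-defined:
		return uuid
	case <-time.After(timeout):
		panic("timed out waiting for domain to be defined")
	}
}

func main() {
	defined := make(chan string) // never written: libvirt never defines the domain
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("panic:", r)
		}
	}()
	fmt.Println(waitForDomainUUID(50*time.Millisecond, defined)) // short deadline for the demo
}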
level=error timestamp=2018-08-02T16:48:48.327614Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T16:49:29.300056Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T16:49:29.300470Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T16:50:51.240113Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T16:50:51.241020Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4202ee3b0, 0xc4200dc380, 0xc420556ae0)
        /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
        /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

• Failure in Spec Setup (BeforeEach) [360.833 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with winrm connection
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
    should have correct UUID [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:192

    Unexpected Warning event received: testvmid28dm,cbf0cbf1-9673-11e8-93fb-525500d15501: server error.
command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ') Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246 ------------------------------ STEP: Creating winrm-cli pod for the future use STEP: Starting the windows VirtualMachineInstance level=info timestamp=2018-08-02T16:47:52.795735Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmid28dm kind=VirtualMachineInstance uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmid28dm-f7cgn" level=info timestamp=2018-08-02T16:48:07.378607Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmid28dm kind=VirtualMachineInstance uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmid28dm-f7cgn" level=error timestamp=2018-08-02T16:48:07.535327Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmid28dm kind=VirtualMachineInstance uid=cbf0cbf1-9673-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:51:46 http: TLS handshake error from 10.244.0.1:56564: EOF 2018/08/02 16:51:56 http: TLS handshake error from 10.244.0.1:56624: EOF 2018/08/02 16:52:06 http: TLS handshake error from 10.244.0.1:56684: EOF 2018/08/02 16:52:16 http: TLS handshake error from 10.244.0.1:56744: EOF 2018/08/02 16:52:26 http: TLS handshake error from 10.244.0.1:56804: EOF 2018/08/02 16:52:36 http: TLS handshake error from 10.244.0.1:56864: EOF 2018/08/02 16:52:46 http: TLS handshake error from 10.244.0.1:56924: EOF 2018/08/02 16:52:56 http: TLS handshake error from 10.244.0.1:56984: EOF 2018/08/02 16:53:06 http: TLS handshake error from 10.244.0.1:57044: EOF 2018/08/02 16:53:16 http: TLS handshake error from 10.244.0.1:57104: EOF 2018/08/02 16:53:26 http: TLS handshake error from 10.244.0.1:57164: EOF 2018/08/02 16:53:36 http: TLS handshake error from 10.244.0.1:57224: EOF 2018/08/02 16:53:46 http: TLS handshake error from 10.244.0.1:57284: EOF 2018/08/02 16:53:56 http: TLS handshake error from 10.244.0.1:57344: EOF 2018/08/02 16:54:06 http: TLS handshake error from 10.244.0.1:57404: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:53:14.849175Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:16.736015Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:53:17 http: TLS handshake error from 10.244.1.1:51360: EOF level=info timestamp=2018-08-02T16:53:22.257368Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:27 http: TLS handshake error from 10.244.1.1:51366: EOF level=info 
timestamp=2018-08-02T16:53:31.520959Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:37 http: TLS handshake error from 10.244.1.1:51372: EOF level=info timestamp=2018-08-02T16:53:44.485912Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:44.911182Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:53:46.788785Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:53:47 http: TLS handshake error from 10.244.1.1:51378: EOF level=info timestamp=2018-08-02T16:53:52.352099Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:53:57 http: TLS handshake error from 10.244.1.1:51384: EOF level=info timestamp=2018-08-02T16:54:01.613304Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:54:07 http: TLS handshake error from 10.244.1.1:51390: EOF Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:54:08.614525Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:54:08.619816Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:54:08.620195Z pos=vm.go:250 component=virt-handler reason="server error. 
command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvmgj" level=info timestamp=2018-08-02T16:54:08.660761Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmivvmgj, existing: true\n" level=info timestamp=2018-08-02T16:54:08.660935Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:54:08.661002Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:54:08.661235Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:54:08.665936Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:54:08.666344Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvmgj" level=info timestamp=2018-08-02T16:54:08.746841Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmivvmgj, existing: true\n" level=info timestamp=2018-08-02T16:54:08.746967Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:54:08.747079Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:54:08.747294Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:54:08.753540Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:54:08.753904Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvmgj" Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmivvmgj-z5vbr Pod phase: Running level=info timestamp=2018-08-02T16:54:08.448036Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer" level=error timestamp=2018-08-02T16:54:08.530645Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.530914Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.602398Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.602563Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.608915Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.609092Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.619141Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.619283Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.665239Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
level=error timestamp=2018-08-02T16:54:08.665403Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.752853Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.753051Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:08.918800Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." level=error timestamp=2018-08-02T16:54:08.918962Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:57:26 http: TLS handshake error from 10.244.0.1:58604: EOF 2018/08/02 16:57:36 http: TLS handshake error from 10.244.0.1:58664: EOF 2018/08/02 16:57:46 http: TLS handshake error from 10.244.0.1:58724: EOF 2018/08/02 16:57:56 http: TLS handshake error from 10.244.0.1:58784: EOF 2018/08/02 16:58:06 http: TLS handshake error from 10.244.0.1:58844: EOF 2018/08/02 16:58:16 http: TLS handshake error from 10.244.0.1:58904: EOF 2018/08/02 16:58:26 http: TLS handshake error from 10.244.0.1:58964: EOF 2018/08/02 16:58:36 http: TLS handshake error from 10.244.0.1:59024: EOF 2018/08/02 16:58:46 http: TLS handshake error from 10.244.0.1:59084: EOF 2018/08/02 16:58:56 http: TLS handshake error from 10.244.0.1:59144: EOF 2018/08/02 16:59:06 http: TLS handshake error from 10.244.0.1:59204: EOF 2018/08/02 16:59:16 http: TLS handshake error from 10.244.0.1:59264: EOF 2018/08/02 16:59:26 http: TLS handshake error from 10.244.0.1:59324: EOF 2018/08/02 16:59:36 http: TLS handshake error from 10.244.0.1:59384: EOF 2018/08/02 16:59:46 http: TLS handshake error from 10.244.0.1:59444: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:59:16.530706Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:59:16.548474Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:59:16.771104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 
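The steady "http: TLS handshake error from ...: EOF" lines in the virt-api logs are Go's net/http server noting that a client opened a TCP connection and closed it before completing the TLS handshake; given the fixed 10-second cadence, these are almost certainly TCP-level health probes, i.e. noise rather than part of this failure. The behavior is easy to reproduce with a plain net/http TLS server:

package main

import (
	"fmt"
	"net"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// Any TLS-serving http.Server will do; httptest provides one with a throwaway cert.
	srv := httptest.NewTLSServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {}))
	defer srv.Close()

	// Behave like a TCP health probe: connect, then close without handshaking.
	conn, err := net.Dial("tcp", srv.Listener.Addr().String())
	if err != nil {
		panic(err)
	}
	conn.Close()

	time.Sleep(200 * time.Millisecond) // let the server goroutine log
	fmt.Println("stderr now shows: http: TLS handshake error from ...: EOF")
}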
2018/08/02 16:59:17 http: TLS handshake error from 10.244.1.1:51576: EOF level=info timestamp=2018-08-02T16:59:23.325799Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:59:24.803402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:59:24.806843Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:59:27 http: TLS handshake error from 10.244.1.1:51582: EOF level=info timestamp=2018-08-02T16:59:33.117536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 16:59:37 http: TLS handshake error from 10.244.1.1:51588: EOF level=info timestamp=2018-08-02T16:59:45.398092Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:59:45.781419Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:59:46.631241Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:59:47 http: TLS handshake error from 10.244.1.1:51594: EOF level=info timestamp=2018-08-02T16:59:53.408037Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.031977Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-02T16:34:11.032037Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-02T16:34:11.032084Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." 
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T16:56:52.527461Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:56:52.527547Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:56:52.528189Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T16:56:52.544861Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T16:56:52.545468Z pos=vm.go:250 component=virt-handler reason="server error. 
command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvmgj" level=info timestamp=2018-08-02T16:59:36.387261Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmivvmgj, existing: true\n" level=info timestamp=2018-08-02T16:59:36.388178Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T16:59:36.388254Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:59:36.388695Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:59:36.439968Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T16:59:36.446476Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmivvmgj, existing: true\n" level=info timestamp=2018-08-02T16:59:36.446666Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T16:59:36.446740Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T16:59:36.446933Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T16:59:36.447111Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-mr9b9 Pod phase: Running level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" Pod name: virt-launcher-testvmivvmgj-z5vbr Pod phase: Failed level=error timestamp=2018-08-02T16:54:29.129740Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" level=error timestamp=2018-08-02T16:54:49.626773Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed." 
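The launcher's retry timestamps in this dump (16:54:29 → 16:54:49 → 16:55:30 → 16:56:52, i.e. gaps of roughly 20s, 41s, 82s) show the re-enqueue delay doubling per failure, the signature of rate limiting with exponential backoff. A self-contained sketch of that cadence; virt-handler's real loop re-enqueues into a client-go rate-limited workqueue rather than sleeping inline:

package main

import (
	"fmt"
	"time"
)

// retryWithBackoff retries op, doubling the delay after each failure up to max.
func retryWithBackoff(op func() error, base, max time.Duration, attempts int) error {
	delay := base
	var err error
	for i := 0; i < attempts; i++ {
		if err = op(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed (%v); re-enqueuing in %s\n", i+1, err, delay)
		time.Sleep(delay)
		delay *= 2
		if delay > max {
			delay = max
		}
	}
	return err
}

func main() {
	_ = retryWithBackoff(func() error {
		return fmt.Errorf("mkdir .../windows-disk: file exists")
	}, 10*time.Millisecond, 80*time.Millisecond, 4)
}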
level=error timestamp=2018-08-02T16:54:49.628536Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T16:55:30.602690Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T16:55:30.603248Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T16:56:52.542829Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T16:56:52.543420Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4201fa440, 0xc4201f84d0, 0xc420092f60)
        /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
        /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2

• Failure in Spec Setup (BeforeEach) [360.896 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with winrm connection
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:150
    should have pod IP [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:208

    Unexpected Warning event received: testvmivvmgj,a30b987a-9674-11e8-93fb-525500d15501: server error.
command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ') Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246 ------------------------------ STEP: Creating winrm-cli pod for the future use STEP: Starting the windows VirtualMachineInstance level=info timestamp=2018-08-02T16:53:53.728721Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmivvmgj kind=VirtualMachineInstance uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmivvmgj-z5vbr" level=info timestamp=2018-08-02T16:54:08.623941Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmivvmgj kind=VirtualMachineInstance uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmivvmgj-z5vbr" level=error timestamp=2018-08-02T16:54:08.812222Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmivvmgj kind=VirtualMachineInstance uid=a30b987a-9674-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 16:57:46 http: TLS handshake error from 10.244.0.1:58724: EOF 2018/08/02 16:57:56 http: TLS handshake error from 10.244.0.1:58784: EOF 2018/08/02 16:58:06 http: TLS handshake error from 10.244.0.1:58844: EOF 2018/08/02 16:58:16 http: TLS handshake error from 10.244.0.1:58904: EOF 2018/08/02 16:58:26 http: TLS handshake error from 10.244.0.1:58964: EOF 2018/08/02 16:58:36 http: TLS handshake error from 10.244.0.1:59024: EOF 2018/08/02 16:58:46 http: TLS handshake error from 10.244.0.1:59084: EOF 2018/08/02 16:58:56 http: TLS handshake error from 10.244.0.1:59144: EOF 2018/08/02 16:59:06 http: TLS handshake error from 10.244.0.1:59204: EOF 2018/08/02 16:59:16 http: TLS handshake error from 10.244.0.1:59264: EOF 2018/08/02 16:59:26 http: TLS handshake error from 10.244.0.1:59324: EOF 2018/08/02 16:59:36 http: TLS handshake error from 10.244.0.1:59384: EOF 2018/08/02 16:59:46 http: TLS handshake error from 10.244.0.1:59444: EOF 2018/08/02 16:59:56 http: TLS handshake error from 10.244.0.1:59504: EOF 2018/08/02 17:00:06 http: TLS handshake error from 10.244.0.1:59564: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T16:59:23.325799Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T16:59:24.803402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T16:59:24.806843Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 16:59:27 http: TLS handshake error from 10.244.1.1:51582: EOF level=info timestamp=2018-08-02T16:59:33.117536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- 
Pod name: disks-images-provider-4dk74
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-h89ml
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-vm5c8
Pod phase: Running
2018/08/02 16:57:46 http: TLS handshake error from 10.244.0.1:58724: EOF
2018/08/02 16:57:56 http: TLS handshake error from 10.244.0.1:58784: EOF
2018/08/02 16:58:06 http: TLS handshake error from 10.244.0.1:58844: EOF
2018/08/02 16:58:16 http: TLS handshake error from 10.244.0.1:58904: EOF
2018/08/02 16:58:26 http: TLS handshake error from 10.244.0.1:58964: EOF
2018/08/02 16:58:36 http: TLS handshake error from 10.244.0.1:59024: EOF
2018/08/02 16:58:46 http: TLS handshake error from 10.244.0.1:59084: EOF
2018/08/02 16:58:56 http: TLS handshake error from 10.244.0.1:59144: EOF
2018/08/02 16:59:06 http: TLS handshake error from 10.244.0.1:59204: EOF
2018/08/02 16:59:16 http: TLS handshake error from 10.244.0.1:59264: EOF
2018/08/02 16:59:26 http: TLS handshake error from 10.244.0.1:59324: EOF
2018/08/02 16:59:36 http: TLS handshake error from 10.244.0.1:59384: EOF
2018/08/02 16:59:46 http: TLS handshake error from 10.244.0.1:59444: EOF
2018/08/02 16:59:56 http: TLS handshake error from 10.244.0.1:59504: EOF
2018/08/02 17:00:06 http: TLS handshake error from 10.244.0.1:59564: EOF
Pod name: virt-api-bcc6b587d-wc67b
Pod phase: Running
level=info timestamp=2018-08-02T16:59:23.325799Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T16:59:24.803402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T16:59:24.806843Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 16:59:27 http: TLS handshake error from 10.244.1.1:51582: EOF
level=info timestamp=2018-08-02T16:59:33.117536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 16:59:37 http: TLS handshake error from 10.244.1.1:51588: EOF
level=info timestamp=2018-08-02T16:59:45.398092Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T16:59:45.781419Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T16:59:46.631241Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 16:59:47 http: TLS handshake error from 10.244.1.1:51594: EOF
level=info timestamp=2018-08-02T16:59:53.408037Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T16:59:55.129737Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 16:59:57 http: TLS handshake error from 10.244.1.1:51600: EOF
level=info timestamp=2018-08-02T17:00:03.239186Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:00:07 http: TLS handshake error from 10.244.1.1:51606: EOF
Pod name: virt-controller-67dcdd8464-8bbvq
Pod phase: Running
level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.183162Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:59:55.185143Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.463810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi27djk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
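The "object has been modified" message just above is the apiserver's standard optimistic-concurrency conflict on a stale resourceVersion; the controller copes by re-enqueuing the VMI key. A client-go sketch of the equivalent refetch-and-retry idiom (illustrative only; the update closure is a stand-in, not virt-controller's actual code path, which uses a workqueue):

package main

import (
	"fmt"

	"k8s.io/client-go/util/retry"
)

func main() {
	attempts := 0
	// RetryOnConflict re-runs the closure with backoff whenever it returns
	// a 409 Conflict error; any other error (or nil) ends the loop.
	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
		attempts++
		// A real controller would re-GET the VMI here and re-apply its
		// change against the fresh resourceVersion; we just succeed to
		// show the call shape.
		return nil
	})
	fmt.Println("attempts:", attempts, "err:", err)
}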
Pod name: virt-controller-67dcdd8464-grg2b
Pod phase: Running
level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-7tpdp
Pod phase: Running
level=info timestamp=2018-08-02T17:00:11.152848Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:00:11.157565Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:00:11.157942Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
level=info timestamp=2018-08-02T17:00:11.198401Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: true\n"
level=info timestamp=2018-08-02T17:00:11.198564Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:00:11.198709Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:00:11.198912Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:00:11.209486Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:00:11.209944Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
level=info timestamp=2018-08-02T17:00:11.290559Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: true\n"
level=info timestamp=2018-08-02T17:00:11.290775Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:00:11.290846Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:00:11.291066Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:00:11.304809Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:00:11.305239Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
Pod name: virt-handler-mr9b9
Pod phase: Running
level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
Pod name: virt-launcher-testvmi27djk-nr4h5
Pod phase: Running
level=info timestamp=2018-08-02T17:00:10.928966Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer"
level=error timestamp=2018-08-02T17:00:11.018661Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.018997Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.140764Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.140960Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.147327Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.147547Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.156935Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.157095Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.203322Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.203484Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.303973Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.304171Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:11.471896Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:11.472081Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
Pod name: disks-images-provider-4dk74
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-h89ml
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-vm5c8
Pod phase: Running
2018/08/02 17:01:26 http: TLS handshake error from 10.244.0.1:60044: EOF
2018/08/02 17:01:36 http: TLS handshake error from 10.244.0.1:60104: EOF
2018/08/02 17:01:46 http: TLS handshake error from 10.244.0.1:60164: EOF
2018/08/02 17:01:56 http: TLS handshake error from 10.244.0.1:60224: EOF
2018/08/02 17:02:06 http: TLS handshake error from 10.244.0.1:60284: EOF
2018/08/02 17:02:16 http: TLS handshake error from 10.244.0.1:60344: EOF
2018/08/02 17:02:26 http: TLS handshake error from 10.244.0.1:60404: EOF
2018/08/02 17:02:36 http: TLS handshake error from 10.244.0.1:60464: EOF
2018/08/02 17:02:46 http: TLS handshake error from 10.244.0.1:60524: EOF
2018/08/02 17:02:56 http: TLS handshake error from 10.244.0.1:60584: EOF
2018/08/02 17:03:06 http: TLS handshake error from 10.244.0.1:60644: EOF
2018/08/02 17:03:16 http: TLS handshake error from 10.244.0.1:60704: EOF
2018/08/02 17:03:26 http: TLS handshake error from 10.244.0.1:60764: EOF
2018/08/02 17:03:36 http: TLS handshake error from 10.244.0.1:60824: EOF
2018/08/02 17:03:46 http: TLS handshake error from 10.244.0.1:60884: EOF
Pod name: virt-api-bcc6b587d-wc67b
Pod phase: Running
level=info timestamp=2018-08-02T17:03:04.056467Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:07 http: TLS handshake error from 10.244.1.1:51714: EOF
level=info timestamp=2018-08-02T17:03:15.891104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:16.249709Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:16.734149Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 17:03:17 http: TLS handshake error from 10.244.1.1:51720: EOF
level=info timestamp=2018-08-02T17:03:24.054887Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:27 http: TLS handshake error from 10.244.1.1:51726: EOF
level=info timestamp=2018-08-02T17:03:34.206965Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:37 http: TLS handshake error from 10.244.1.1:51732: EOF
level=info timestamp=2018-08-02T17:03:45.976858Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:46.309566Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:46.687528Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 17:03:47 http: TLS handshake error from 10.244.1.1:51738: EOF
level=info timestamp=2018-08-02T17:03:54.143871Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-8bbvq
Pod phase: Running
level=info timestamp=2018-08-02T16:34:11.046148Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-02T16:34:11.046331Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.183162Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:59:55.185143Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.463810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi27djk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
Pod name: virt-controller-67dcdd8464-grg2b
Pod phase: Running
level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-7tpdp
Pod phase: Running
level=info timestamp=2018-08-02T17:00:52.159017Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:00:52.171156Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:00:52.171829Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
level=info timestamp=2018-08-02T17:01:33.132959Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: true\n"
level=info timestamp=2018-08-02T17:01:33.133822Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:01:33.133966Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:01:33.134567Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:01:33.158503Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:01:33.159427Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
level=info timestamp=2018-08-02T17:02:55.080827Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: true\n"
level=info timestamp=2018-08-02T17:02:55.081623Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:02:55.081739Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:02:55.082142Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:02:55.100134Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:02:55.100803Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
Pod name: virt-handler-mr9b9
Pod phase: Running
level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
Pod name: virt-launcher-testvmi27djk-nr4h5
Pod phase: Running
level=error timestamp=2018-08-02T17:00:12.446133Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:13.733958Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:13.734158Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:16.300248Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:16.300462Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:21.427289Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:21.427463Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:31.675433Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:31.675699Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:00:52.169741Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:00:52.170183Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:01:33.155120Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:01:33.156252Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:02:55.098231Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:02:55.098868Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"

• Failure [241.900 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with kubectl command
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
    should succeed to start a vmi [It]
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:242

    Unexpected Warning event received: testvmi27djk,7ac241cc-9675-11e8-93fb-525500d15501: server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246
------------------------------
STEP: Starting the vmi via kubectl command
level=info timestamp=2018-08-02T16:59:55.639453Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi27djk-nr4h5"
level=info timestamp=2018-08-02T17:00:11.080681Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi27djk-nr4h5"
level=error timestamp=2018-08-02T17:00:11.306743Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid=7ac241cc-9675-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')"
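Both failures are reported through the same check at tests/utils.go:246, which watches the VMI's events and requires that none of type Warning arrives; the "Expected : Warning not to equal : Warning" output is that assertion firing. A simplified Go sketch of the pattern, with a stub channel standing in for the real Kubernetes event watcher used by the tests:

package main

import "fmt"

// checkEvents fails as soon as any observed event type is "Warning",
// mirroring the unexpected-warning assertion style seen above.
// (In core/v1 terms the comparison would be against v1.EventTypeWarning.)
func checkEvents(types <-chan string) error {
	for t := range types {
		if t == "Warning" {
			return fmt.Errorf("unexpected warning event received")
		}
	}
	return nil
}

func main() {
	ch := make(chan string, 2)
	ch <- "Normal"
	ch <- "Warning"
	close(ch)
	fmt.Println(checkEvents(ch)) // prints the unexpected-warning error
}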
Pod name: disks-images-provider-4dk74
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-h89ml
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-vm5c8
Pod phase: Running
2018/08/02 17:01:46 http: TLS handshake error from 10.244.0.1:60164: EOF
2018/08/02 17:01:56 http: TLS handshake error from 10.244.0.1:60224: EOF
2018/08/02 17:02:06 http: TLS handshake error from 10.244.0.1:60284: EOF
2018/08/02 17:02:16 http: TLS handshake error from 10.244.0.1:60344: EOF
2018/08/02 17:02:26 http: TLS handshake error from 10.244.0.1:60404: EOF
2018/08/02 17:02:36 http: TLS handshake error from 10.244.0.1:60464: EOF
2018/08/02 17:02:46 http: TLS handshake error from 10.244.0.1:60524: EOF
2018/08/02 17:02:56 http: TLS handshake error from 10.244.0.1:60584: EOF
2018/08/02 17:03:06 http: TLS handshake error from 10.244.0.1:60644: EOF
2018/08/02 17:03:16 http: TLS handshake error from 10.244.0.1:60704: EOF
2018/08/02 17:03:26 http: TLS handshake error from 10.244.0.1:60764: EOF
2018/08/02 17:03:36 http: TLS handshake error from 10.244.0.1:60824: EOF
2018/08/02 17:03:46 http: TLS handshake error from 10.244.0.1:60884: EOF
2018/08/02 17:03:56 http: TLS handshake error from 10.244.0.1:60944: EOF
2018/08/02 17:04:06 http: TLS handshake error from 10.244.0.1:32772: EOF
Pod name: virt-api-bcc6b587d-wc67b
Pod phase: Running
level=info timestamp=2018-08-02T17:03:16.734149Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 17:03:17 http: TLS handshake error from 10.244.1.1:51720: EOF
level=info timestamp=2018-08-02T17:03:24.054887Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:27 http: TLS handshake error from 10.244.1.1:51726: EOF
level=info timestamp=2018-08-02T17:03:34.206965Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:37 http: TLS handshake error from 10.244.1.1:51732: EOF
level=info timestamp=2018-08-02T17:03:45.976858Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:46.309566Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:46.687528Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 17:03:47 http: TLS handshake error from 10.244.1.1:51738: EOF
level=info timestamp=2018-08-02T17:03:54.143871Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T17:03:56.663382Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:03:57 http: TLS handshake error from 10.244.1.1:51744: EOF
level=info timestamp=2018-08-02T17:04:04.346992Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 17:04:07 http: TLS handshake error from 10.244.1.1:51750: EOF
Pod name: virt-controller-67dcdd8464-8bbvq
Pod phase: Running
level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.183162Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T16:59:55.185143Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T16:59:55.463810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi27djk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk"
level=info timestamp=2018-08-02T17:03:56.719554Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T17:03:56.724322Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-grg2b
Pod phase: Running
level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-7tpdp
Pod phase: Running
level=info timestamp=2018-08-02T17:04:12.784138Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:04:12.788323Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:04:12.788722Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p"
level=info timestamp=2018-08-02T17:04:12.829171Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmirkf7p, existing: true\n"
level=info timestamp=2018-08-02T17:04:12.829345Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:04:12.829412Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:04:12.829698Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:04:12.834347Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:04:12.834719Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p"
level=info timestamp=2018-08-02T17:04:12.915172Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmirkf7p, existing: true\n"
level=info timestamp=2018-08-02T17:04:12.915348Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:04:12.915423Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:04:12.915745Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:04:12.920292Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:04:12.920748Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p"
Pod name: virt-handler-mr9b9
Pod phase: Running
level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
Pod name: virt-launcher-testvmirkf7p-sgzrv
Pod phase: Running
level=info timestamp=2018-08-02T17:04:12.617328Z pos=dhcp.go:62 component=virt-launcher msg="Starting SingleClientDHCPServer"
level=error timestamp=2018-08-02T17:04:12.707722Z pos=libvirt_helper.go:97 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Defining the VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.708051Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:12.771698Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.771918Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:12.779245Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.779441Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:12.787473Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.787683Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:12.833717Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.833899Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:12.919569Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:12.919815Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:13.085374Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:13.085552Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
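The recurring "mkdir ...: file exists" above is the launcher's pre-start hook tripping over a disk-data directory left behind by an earlier sync attempt, so every retry fails the same way. A sketch of the idempotent-create pattern that avoids this failure mode; this is an assumption about the cause (a non-idempotent bare mkdir), not the actual KubeVirt fix:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// ensureDiskDataDir creates the per-disk directory idempotently.
// os.MkdirAll returns nil when the directory already exists, unlike
// os.Mkdir, which fails with "file exists" on the second sync attempt.
func ensureDiskDataDir(base, disk string) (string, error) {
	dir := filepath.Join(base, "disk-data", disk)
	if err := os.MkdirAll(dir, 0755); err != nil {
		return "", err
	}
	return dir, nil
}

func main() {
	for i := 0; i < 2; i++ { // the second pass would break a bare os.Mkdir
		dir, err := ensureDiskDataDir(os.TempDir(), "windows-disk")
		fmt.Println(dir, err)
	}
}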
level=error timestamp=2018-08-02T17:04:13.085552Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi" Pod name: disks-images-provider-4dk74 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-h89ml Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-vm5c8 Pod phase: Running 2018/08/02 17:05:36 http: TLS handshake error from 10.244.0.1:33312: EOF 2018/08/02 17:05:46 http: TLS handshake error from 10.244.0.1:33372: EOF 2018/08/02 17:05:56 http: TLS handshake error from 10.244.0.1:33432: EOF 2018/08/02 17:06:06 http: TLS handshake error from 10.244.0.1:33492: EOF 2018/08/02 17:06:16 http: TLS handshake error from 10.244.0.1:33552: EOF 2018/08/02 17:06:26 http: TLS handshake error from 10.244.0.1:33612: EOF 2018/08/02 17:06:36 http: TLS handshake error from 10.244.0.1:33672: EOF 2018/08/02 17:06:46 http: TLS handshake error from 10.244.0.1:33732: EOF 2018/08/02 17:06:56 http: TLS handshake error from 10.244.0.1:33792: EOF 2018/08/02 17:07:06 http: TLS handshake error from 10.244.0.1:33852: EOF 2018/08/02 17:07:16 http: TLS handshake error from 10.244.0.1:33912: EOF 2018/08/02 17:07:26 http: TLS handshake error from 10.244.0.1:33972: EOF 2018/08/02 17:07:36 http: TLS handshake error from 10.244.0.1:34032: EOF 2018/08/02 17:07:46 http: TLS handshake error from 10.244.0.1:34092: EOF 2018/08/02 17:07:56 http: TLS handshake error from 10.244.0.1:34152: EOF Pod name: virt-api-bcc6b587d-wc67b Pod phase: Running level=info timestamp=2018-08-02T17:07:16.453785Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T17:07:16.739978Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T17:07:16.904429Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 17:07:17 http: TLS handshake error from 10.244.1.1:51864: EOF level=info timestamp=2018-08-02T17:07:24.157368Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T17:07:24.161840Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T17:07:24.819404Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 17:07:27 http: TLS handshake error from 10.244.1.1:51870: EOF level=info timestamp=2018-08-02T17:07:35.421992Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 17:07:37 http: TLS handshake error from 10.244.1.1:51876: EOF level=info timestamp=2018-08-02T17:07:46.517896Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T17:07:46.752900Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T17:07:46.975990Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 17:07:47 http: TLS handshake error from 10.244.1.1:51882: EOF level=info timestamp=2018-08-02T17:07:54.907693Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-8bbvq Pod phase: Running level=info timestamp=2018-08-02T16:34:11.046395Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-02T16:34:11.046454Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-02T16:35:50.633047Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:35:50.636120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2hqgc kind= uid=1dba426c-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:41:51.640349Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:41:51.642852Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicc4mf kind= uid=f4ec8e6b-9672-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:47:52.409324Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:47:52.412313Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid28dm kind= uid=cbf0cbf1-9673-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:53:53.333837Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:53:53.336946Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvmgj kind= uid=a30b987a-9674-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:59:55.183162Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T16:59:55.185143Z 
pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi27djk kind= uid=7ac241cc-9675-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T16:59:55.463810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi27djk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi27djk" level=info timestamp=2018-08-02T17:03:56.719554Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T17:03:56.724322Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-grg2b Pod phase: Running level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-7tpdp Pod phase: Running level=info timestamp=2018-08-02T17:05:34.744073Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T17:05:34.744180Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T17:05:34.744728Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T17:05:34.761879Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T17:05:34.762450Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p" level=info timestamp=2018-08-02T17:05:38.941947Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: false\n" level=info timestamp=2018-08-02T17:05:38.942136Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T17:05:38.942285Z pos=vm.go:412 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T17:05:38.942545Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-controller-67dcdd8464-grg2b
Pod phase: Running
level=info timestamp=2018-08-02T16:34:12.150101Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-7tpdp
Pod phase: Running
level=info timestamp=2018-08-02T17:05:34.744073Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:05:34.744180Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:05:34.744728Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:05:34.761879Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:05:34.762450Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p"
level=info timestamp=2018-08-02T17:05:38.941947Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmi27djk, existing: false\n"
level=info timestamp=2018-08-02T17:05:38.942136Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:05:38.942285Z pos=vm.go:412 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T17:05:38.942545Z pos=vm.go:439 component=virt-handler namespace=kubevirt-test-default name=testvmi27djk kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T17:06:56.683346Z pos=vm.go:312 component=virt-handler msg="Processing vmi testvmirkf7p, existing: true\n"
level=info timestamp=2018-08-02T17:06:56.683939Z pos=vm.go:314 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T17:06:56.684053Z pos=vm.go:328 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T17:06:56.684386Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T17:06:56.699304Z pos=vm.go:423 component=virt-handler namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T17:06:56.699967Z pos=vm.go:250 component=virt-handler reason="server error. command Launcher.Sync failed: mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmirkf7p"
Pod name: virt-handler-mr9b9
Pod phase: Running
level=info timestamp=2018-08-02T16:34:13.916236Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-02T16:34:13.931174Z pos=vm.go:211 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-02T16:34:13.933143Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
Pod name: virt-launcher-testvmirkf7p-sgzrv
Pod phase: Running
level=error timestamp=2018-08-02T17:04:14.058313Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:15.345347Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:15.345528Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:17.912857Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:17.913038Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:23.038976Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:23.039169Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:33.286861Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:33.287128Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:04:53.780085Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:04:53.780423Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:05:34.759474Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:05:34.760482Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
level=error timestamp=2018-08-02T17:06:56.697662Z pos=manager.go:151 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="pre start setup for VirtualMachineInstance failed."
level=error timestamp=2018-08-02T17:06:56.698273Z pos=server.go:68 component=virt-launcher namespace=kubevirt-test-default name=testvmirkf7p kind= uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="mkdir /var/run/libvirt/kubevirt-ephemeral-disk/disk-data/windows-disk: file exists" msg="Failed to sync vmi"
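Every retry of Launcher.Sync above dies on the same mkdir: the ephemeral disk directory created by the first, failed launch attempt is still on disk, and a plain mkdir is not idempotent, so the pre-start loop can never make progress. A minimal sketch of the idempotent pattern, in Go like the components logged here; this shows the general fix shape under that assumption, not KubeVirt's actual patch, and the path is a stand-in for the one in the log:

    package main

    import (
        "fmt"
        "os"
    )

    // ensureEphemeralDir tolerates a directory left behind by an earlier
    // launch attempt: os.MkdirAll returns nil when the path already
    // exists as a directory, whereas a bare os.Mkdir fails with
    // "file exists" on every retry, matching the loop in the log above.
    func ensureEphemeralDir(path string) error {
        if err := os.MkdirAll(path, 0755); err != nil {
            return fmt.Errorf("creating ephemeral disk dir %s: %v", path, err)
        }
        return nil
    }

    func main() {
        // Stand-in path; any writable location demonstrates the behavior.
        if err := ensureEphemeralDir("/tmp/kubevirt-ephemeral-disk/disk-data/windows-disk"); err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
    }

Equivalently, the first failed attempt could clean up its partial directory before returning, but making the create idempotent is the simpler and more robust choice for a retried sync loop.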
• Failure [241.582 seconds]
Windows VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:57
  with kubectl command
  /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:226
    should succeed to stop a vmi [It]
    /root/go/src/kubevirt.io/kubevirt/tests/windows_test.go:250

    Unexpected Warning event received: testvmirkf7p,0ab994ac-9676-11e8-93fb-525500d15501: server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')
    Expected
      : Warning
    not to equal
      : Warning
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:246
------------------------------
STEP: Starting the vmi via kubectl command
level=info timestamp=2018-08-02T17:03:57.182518Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmirkf7p kind=VirtualMachineInstance uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Created virtual machine pod virt-launcher-testvmirkf7p-sgzrv"
level=info timestamp=2018-08-02T17:04:12.730834Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmirkf7p kind=VirtualMachineInstance uid=0ab994ac-9676-11e8-93fb-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmirkf7p-sgzrv"
level=error timestamp=2018-08-02T17:04:12.959329Z pos=utils.go:242 component=tests namespace=kubevirt-test-default name=testvmirkf7p kind=VirtualMachineInstance uid=0ab994ac-9676-11e8-93fb-525500d15501 reason="unexpected warning event received" msg="server error. command Launcher.Sync failed: virError(Code=8, Domain=44, Message='invalid argument: could not find capabilities for arch=x86_64 domaintype=kvm ')"
SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS
Waiting for namespace kubevirt-test-default to be removed, this can take a while ...
Waiting for namespace kubevirt-test-alternative to be removed, this can take a while ...

Summarizing 6 Failures:

[Fail] Windows VirtualMachineInstance [It] should succeed to start a vmi
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

[Fail] Windows VirtualMachineInstance [It] should succeed to stop a running vmi
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

[Fail] Windows VirtualMachineInstance with winrm connection [BeforeEach] should have correct UUID
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

[Fail] Windows VirtualMachineInstance with winrm connection [BeforeEach] should have pod IP
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

[Fail] Windows VirtualMachineInstance with kubectl command [It] should succeed to start a vmi
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

[Fail] Windows VirtualMachineInstance with kubectl command [It] should succeed to stop a vmi
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:246

Ran 6 of 148 Specs in 1951.524 seconds
FAIL! -- 0 Passed | 6 Failed | 0 Pending | 142 Skipped
--- FAIL: TestTests (1951.54s)
FAIL
make: *** [functest] Error 1
+ make cluster-down
./cluster/down.sh
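The root failure behind all six specs is the libvirt error quoted in the summary: no kvm domain type was advertised for x86_64, which on a nested-virtualization CI node usually means /dev/kvm is absent or not exposed to the launcher pod. A first-pass check under that assumption, not a definitive diagnosis:

    package main

    import (
        "fmt"
        "os"
    )

    // virError Code=8 ("could not find capabilities for arch=x86_64
    // domaintype=kvm") typically traces back to a missing or
    // inaccessible /dev/kvm; this only verifies that the device node
    // exists where the check is run.
    func main() {
        if _, err := os.Stat("/dev/kvm"); err != nil {
            fmt.Println("/dev/kvm is not available:", err)
            fmt.Println("check nested virtualization on the node and whether the device is passed into the pod")
            os.Exit(1)
        }
        fmt.Println("/dev/kvm is present; inspect 'virsh capabilities' inside the launcher pod next")
    }

If the device is present, comparing 'virsh capabilities' output inside the virt-launcher pod against the host's would be the natural next step.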