+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev
+ [[ k8s-1.11.0-dev =~ openshift-.* ]]
+ [[ k8s-1.11.0-dev =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.11.0
+ KUBEVIRT_PROVIDER=k8s-1.11.0
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/08/02 09:26:54 Waiting for host: 192.168.66.101:22
2018/08/02 09:26:57 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 09:27:05 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 09:27:10 Connected to tcp://192.168.66.101:22
++ systemctl status docker
++ wc -l
++ grep active
+ [[ 0 -eq 0 ]]
+ sleep 2
++ systemctl status docker
++ wc -l
++ grep active
+ [[ 1 -eq 0 ]]
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] using Kubernetes version: v1.11.0
[preflight] running pre-flight checks
I0802 09:27:13.352916 1263 feature_gate.go:230] feature gates: &{map[]}
I0802 09:27:13.430294 1263 kernel_validator.go:81] Validating kernel version
I0802 09:27:13.430477 1263 kernel_validator.go:96] Validating kernel config
[preflight/images] Pulling images required for setting up a Kubernetes cluster
[preflight/images] This might take a minute or two, depending on the speed of your internet connection
[preflight/images] You can also perform this action in beforehand using 'kubeadm config images pull'
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[preflight] Activating the kubelet service
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [node01 localhost] and IPs [127.0.0.1 ::1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01 localhost] and IPs [192.168.66.101 127.0.0.1 ::1]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
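[editor's note] One fix worth flagging in the setup at the top of this trace: the trap registers `make cluster-down` for EXIT, SIGINT, SIGTERM and SIGSTOP, but SIGSTOP can be neither caught nor ignored, so listing it is a no-op. A corrected form of the hook:

  # Tear the cluster down on normal exit or interruption.
  # SIGSTOP is omitted: the kernel stops the process directly and never
  # delivers SIGSTOP to a handler, so trapping it does nothing.
  trap '{ make cluster-down; }' EXIT SIGINT SIGTERM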
[certificates] valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests"
[init] this might take a minute or longer if the control plane images have to be pulled
[apiclient] All control plane components are healthy after 46.505557 seconds
[uploadconfig] storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[kubelet] Creating a ConfigMap "kubelet-config-1.11" in namespace kube-system with the configuration for the kubelets in the cluster
[markmaster] Marking the node node01 as master by adding the label "node-role.kubernetes.io/master=''"
[markmaster] Marking the node node01 as master by adding the taints [node-role.kubernetes.io/master:NoSchedule]
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node01" as an annotation
[bootstraptoken] using token: abcdef.1234567890123456
[bootstraptoken] configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: CoreDNS
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
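[editor's note] The pod-network step kubeadm asks for here is performed a few lines below by the CI script itself. Pulled out of the trace for reference (kubeconfig path as used by this provider):

  kubectl --kubeconfig=/etc/kubernetes/admin.conf \
      apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml

Flannel matches the network_provider=flannel setting sourced from hack/config-default.sh later in this log.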
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at: https://kubernetes.io/docs/concepts/cluster-administration/addons/ You can now join any number of machines by running the following on each node as root: kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:92ad6ae6df399e8844e857ea688a5162ca8b1f0219864500843b3f3567b3bdc0 + kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml clusterrole.rbac.authorization.k8s.io/flannel created clusterrolebinding.rbac.authorization.k8s.io/flannel created serviceaccount/flannel created configmap/kube-flannel-cfg created daemonset.extensions/kube-flannel-ds created + kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule- node/node01 untainted + kubectl --kubeconfig=/etc/kubernetes/admin.conf create -f /tmp/local-volume.yaml storageclass.storage.k8s.io/local created configmap/local-storage-config created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-pv-binding created clusterrole.rbac.authorization.k8s.io/local-storage-provisioner-node-clusterrole created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-node-binding created role.rbac.authorization.k8s.io/local-storage-provisioner-jobs-role created rolebinding.rbac.authorization.k8s.io/local-storage-provisioner-jobs-rolebinding created serviceaccount/local-storage-admin created daemonset.extensions/local-volume-provisioner created 2018/08/02 09:28:16 Waiting for host: 192.168.66.102:22 2018/08/02 09:28:19 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 09:28:27 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 09:28:32 Connected to tcp://192.168.66.102:22 ++ wc -l ++ grep active ++ systemctl status docker + [[ 0 -eq 0 ]] + sleep 2 ++ systemctl status docker ++ grep active ++ wc -l + [[ 1 -eq 0 ]] + kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true [preflight] running pre-flight checks [WARNING RequiredIPVSKernelModulesAvailable]: the IPVS proxier will not be used, because the following required kernel modules are not loaded: [ip_vs_rr ip_vs_wrr ip_vs_sh ip_vs] or no builtin kernel ipvs support: map[ip_vs:{} ip_vs_rr:{} ip_vs_wrr:{} ip_vs_sh:{} nf_conntrack_ipv4:{}] you can solve this problem with following methods: 1. Run 'modprobe -- ' to load missing kernel modules; 2. 
Provide the missing builtin kernel ipvs support I0802 09:28:35.118770 1271 kernel_validator.go:81] Validating kernel version I0802 09:28:35.118993 1271 kernel_validator.go:96] Validating kernel config [discovery] Trying to connect to API Server "192.168.66.101:6443" [discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443" [discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443" [discovery] Successfully established connection with API Server "192.168.66.101:6443" [kubelet] Downloading configuration for the kubelet from the "kubelet-config-1.11" ConfigMap in the kube-system namespace [kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml" [kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env" [preflight] Activating the kubelet service [tlsbootstrap] Waiting for the kubelet to perform the TLS Bootstrap... [patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node02" as an annotation This node has joined the cluster: * Certificate signing request was sent to master and a response was received. * The Kubelet was informed of the new secure connection details. Run 'kubectl get nodes' on the master to see this node join the cluster. Sending file modes: C0755 38739968 kubectl Sending file modes: C0600 5450 admin.conf Cluster "kubernetes" set. Cluster "kubernetes" set. + set +e + kubectl get nodes --no-headers + cluster/kubectl.sh get nodes --no-headers node01 Ready master 42s v1.11.0 node02 Ready 16s v1.11.0 + kubectl_rc=0 + '[' 0 -ne 0 ']' ++ kubectl get nodes --no-headers ++ cluster/kubectl.sh get nodes --no-headers ++ grep NotReady + '[' -n '' ']' + set -e + echo 'Nodes are ready:' Nodes are ready: + kubectl get nodes + cluster/kubectl.sh get nodes NAME STATUS ROLES AGE VERSION node01 Ready master 42s v1.11.0 node02 Ready 16s v1.11.0 + make cluster-sync ./cluster/build.sh Building ... 
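[editor's note] The cluster-up sequence above repeats one pattern three times: poll until a condition holds, sleeping between attempts (the SSH dial loop, the docker active check, and the NotReady scan). A bash sketch of that pattern, generalized with an explicit retry budget; the helper names and the attempt caps are illustrative, not taken from the CI scripts:

  #!/usr/bin/env bash
  set -euo pipefail

  # Generic poll helper: run a predicate until it succeeds or the attempt
  # budget runs out. The sleep intervals mirror the 5s/2s pauses above.
  wait_for() {
      local desc=$1 interval=$2 tries=$3; shift 3
      local i
      for ((i = 1; i <= tries; i++)); do
          if "$@"; then
              return 0
          fi
          echo "still waiting for ${desc} (attempt ${i}/${tries}); sleeping ${interval}s"
          sleep "$interval"
      done
      echo "timed out waiting for ${desc}" >&2
      return 1
  }

  # The three conditions the provisioner gates on:
  tcp_up()        { timeout 1 bash -c ">/dev/tcp/$1/$2" 2>/dev/null; }
  docker_active() { systemctl status docker | grep -q active; }
  nodes_ready()   { ! kubectl get nodes --no-headers | grep -q NotReady; }

  wait_for "ssh on node01" 5 60 tcp_up 192.168.66.101 22
  wait_for "docker"        2 30 docker_active
  wait_for "nodes Ready"   5 60 nodes_ready
  echo 'Nodes are ready:'
  kubectl get nodes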
Untagged: localhost:32920/kubevirt/virt-controller:devel Untagged: localhost:32920/kubevirt/virt-controller@sha256:4c0260bd265bc352d8833ea0392bd90351c3eed09784f11e54027cf669471703 Deleted: sha256:65c138786c2c4f75fd47becbdbdcf2401709b519da64a686949f14b7d1794f6e Deleted: sha256:946f14a85ea7e93ed1bbaa93289fd29476c778a57df8b3f794e8abffc8d46869 Deleted: sha256:6c7949837fb50099b7bd813b3d22e765699d67da606875118d9e7f9d477da315 Deleted: sha256:4321b29ec15d82d8e190be1d02e2621a6ea73e0ff4c4b821548045eb432571ae Untagged: localhost:32920/kubevirt/virt-launcher:devel Untagged: localhost:32920/kubevirt/virt-launcher@sha256:041a2fbf573db05fdab7259e0c15ff5ed32ca79d2ab23667cb2abd408422737b Deleted: sha256:7d82eff426248ef2d1f54ea69cb3b3a23b17a3a4287200d578c53e305b36a842 Deleted: sha256:69e1e4826767dc3528351f29180532246babf1a99d461502b3d252d288bccbae Deleted: sha256:4fc6a31de5804e60e0f49a005d97721fc37e62e66042ba00afe82921fb23ae9d Deleted: sha256:7b61efa00cb37887bca595adae4c60ee3976081853f1f37378ab5fb45d69e2f7 Deleted: sha256:2df129bd2a237534b4906393a77c0726fd5448b57ef5ad29ba6803ebfd2930a6 Deleted: sha256:caf3529ef0e6e1f2a241be07ac6c9c59ab262e31e1886fc82fb6f0953f566af4 Deleted: sha256:32c54db15a2da939e551316d0ae7128810f4ba6907a1d4bd54478601a4926e71 Deleted: sha256:ccefa074f99a5bdfc6ea21c301eae80cb6aba6a3aaa57bbcdede31a00be7a487 Deleted: sha256:f31b48fb2c814983490ea46d5af9f652fdbff0a62f31b7153023b2b6dfca1afd Deleted: sha256:45333c526bf8f29fca3adeeae8ed1c6bbed5c705b3cad7aee263b9afefb0ad9a Deleted: sha256:5a29cc0d6f1d910751eac895eaef66153c005846e0cff3482f48074847a695ad Deleted: sha256:16ba9655a410223da001f87cfe11d4a2614963bbf7fbe9c4cf734b9323eb062d Untagged: localhost:32920/kubevirt/virt-handler:devel Untagged: localhost:32920/kubevirt/virt-handler@sha256:70570eb34d485c75da26402631199bab7ea082b054f562c9b08731360c9a63cf Deleted: sha256:b379f91bc684f60df1ff9c49a570f15b620979143bbca2113fcbac13edf01f64 Deleted: sha256:e0057934154a89339c17d4212d17b8a404e3a85bfba09c9acd41d86aa7f400fa Deleted: sha256:d8ffa11ceeeb3de1799391b12813bf9e379799eaecf9cc051506ffbca4230e8f Deleted: sha256:7574e0eb6a0cda18343eb552f8d4ff390f43ceeebb330b13a3ec9e10fd5333d8 Untagged: localhost:32920/kubevirt/virt-api:devel Untagged: localhost:32920/kubevirt/virt-api@sha256:d1a11706c5db708afd8b56c8053e69e6020486dcafdd4bbce526a28ae2551856 Deleted: sha256:2ac269b95d4e1c3a0dc75701b9c6ced689ba722a3a901a05227aea44b9411a5e Deleted: sha256:22d0c536393db9452ecadf7ef165c427c38a7f5474dccc0717076fb769c9840a Deleted: sha256:968bd4e5411205339b8f4e52c7d23d1521a9ecbed64bf79535f930ef0de63e2b Deleted: sha256:53398bfef1053b65665a5776aaad1fbc1cd8952fade8e3dacd79fdadf260f124 Untagged: localhost:32920/kubevirt/subresource-access-test:devel Untagged: localhost:32920/kubevirt/subresource-access-test@sha256:1d15fbcb67052c9e3acf9c3be9a124641ab028992b1c61911a9f741d9d2951c8 Deleted: sha256:0ffb9d53dd1914b71e59aefa4fd7a9a89bb13a93658bf18df17471e1ca3519e9 Deleted: sha256:d32ff543152288cf04b3a1b220db6701c97f3564f70df97c55f95fb9b0a33196 Deleted: sha256:9a1eb9c55c2b0a29ec54d0aef9d5cd47122be7cd6c64609cdb819d3d7ba8346f Deleted: sha256:67919c463b483677826de01285a3ffeda579bdc6532e86f48ba55c9b1a87c826 Untagged: localhost:32920/kubevirt/example-hook-sidecar:devel Untagged: localhost:32920/kubevirt/example-hook-sidecar@sha256:29ef047b3f591f5d7c1a89bd26955f4bd64d8a7d3c5429c632dcb740936fc144 Deleted: sha256:096b6dc907bde50b7554db5e8d90c5e7a3dbc53cae39339a6bc4dba4823f8088 Deleted: sha256:e0b7fa27be60b567d7c5448514c6e5a8c4485fbf96ebfa3e24ec8d91fd955a82 Deleted: 
sha256:c1b32204580e033cdc64f8de92e27217bee2491f0c99139faf0e07824dca5ada Deleted: sha256:75f0c09fc0276bc6945ea5a4a376c1b389c44b3f6dd24f6dfa58c96c0c372c5e sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32 go version go1.10 linux/amd64 Waiting for rsyncd to be ready go version go1.10 linux/amd64 make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt' hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32 go version go1.10 linux/amd64 go version go1.10 linux/amd64 find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory Compiling tests... compiled tests.test hack/build-docker.sh build Sending build context to Docker daemon 40.39 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller ---> Using cache ---> b4f3251c6468 Step 4/8 : WORKDIR /home/virt-controller ---> Using cache ---> 813752072d9d Step 5/8 : USER 1001 ---> Using cache ---> 88b3556f36b4 Step 6/8 : COPY virt-controller /usr/bin/virt-controller ---> 82d4f1f5cbaf Removing intermediate container 2807e635b3d3 Step 7/8 : ENTRYPOINT /usr/bin/virt-controller ---> Running in f028d5545c7d ---> 7662aed59b46 Removing intermediate container f028d5545c7d Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "virt-controller" '' ---> Running in 325a8659ce3a ---> da42265b6dde Removing intermediate container 325a8659ce3a Successfully built da42265b6dde Sending build context to Docker daemon 43.31 MB Step 1/10 : FROM kubevirt/libvirt:4.2.0 ---> 5f0bfe81a3e0 Step 2/10 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 09010a005182 Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107 ---> Using cache ---> fc9481693838 Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher ---> 67946c3c76dc Removing intermediate container 68c9a1d80090 Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt ---> 8a0c966fc09d Removing intermediate container 39e48f78bada Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64 ---> Running in ad5bb2551698  ---> a20a6376356a Removing intermediate container ad5bb2551698 Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher ---> Running in 658bc083b4ce  ---> 82ae33f2805e Removing intermediate container 658bc083b4ce Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/ ---> 311dada997e6 Removing intermediate container 6d8d25893d9e Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh ---> Running in cc5754145cdf ---> be5e7377de69 Removing intermediate container cc5754145cdf Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "virt-launcher" '' ---> Running in d1d1ac73022f ---> 03a83fccc179 Removing intermediate container d1d1ac73022f Successfully built 03a83fccc179 Sending build context to Docker daemon 41.74 MB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/5 : COPY virt-handler /usr/bin/virt-handler ---> 164fd8d09bc5 Removing intermediate container 389920010d49 Step 4/5 : ENTRYPOINT /usr/bin/virt-handler ---> 
Running in 7b674a703c04 ---> 3a850e1e8303 Removing intermediate container 7b674a703c04 Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "virt-handler" '' ---> Running in 979ee17cfb6c ---> 47bb922854ac Removing intermediate container 979ee17cfb6c Successfully built 47bb922854ac Sending build context to Docker daemon 38.81 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api ---> Using cache ---> 3cff23506e80 Step 4/8 : WORKDIR /home/virt-api ---> Using cache ---> e94c5606b96b Step 5/8 : USER 1001 ---> Using cache ---> af16317199f5 Step 6/8 : COPY virt-api /usr/bin/virt-api ---> 34a7bf508b03 Removing intermediate container 9987d159f166 Step 7/8 : ENTRYPOINT /usr/bin/virt-api ---> Running in 8a25ddd33edb ---> 666cbbe8707a Removing intermediate container 8a25ddd33edb Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "virt-api" '' ---> Running in dc7225e5fae5 ---> 74d18d1e7343 Removing intermediate container dc7225e5fae5 Successfully built 74d18d1e7343 Sending build context to Docker daemon 4.096 kB Step 1/7 : FROM fedora:28 ---> cc510acfcd70 Step 2/7 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/7 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img ---> Using cache ---> c7d0cf8fc982 Step 5/7 : ADD entrypoint.sh / ---> Using cache ---> 0393e5ee0c37 Step 6/7 : CMD /entrypoint.sh ---> Using cache ---> 23798f49dea3 Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.11.0-dev0" '' ---> Using cache ---> 1af0950b148a Successfully built 1af0950b148a Sending build context to Docker daemon 2.56 kB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/5 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all ---> Using cache ---> d8c990eaf575 Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "vm-killer" '' ---> Using cache ---> d12886a382f5 Successfully built d12886a382f5 Sending build context to Docker daemon 5.12 kB Step 1/7 : FROM debian:sid ---> 68f33cf86aab Step 2/7 : MAINTAINER "David Vossel" \ ---> Using cache ---> 50fc79ebe51c Step 3/7 : ENV container docker ---> Using cache ---> b8e063496923 Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/* ---> Using cache ---> 8adb1572b35c Step 5/7 : ADD entry-point.sh / ---> Using cache ---> 8c0c5a52e4df Step 6/7 : CMD /entry-point.sh ---> Using cache ---> 1a4b838e5dee Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "registry-disk-v1alpha" '' ---> Using cache ---> 7463d4232d9d Successfully built 7463d4232d9d Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33262/kubevirt/registry-disk-v1alpha:devel ---> 7463d4232d9d Step 2/4 : MAINTAINER "David Vossel" \ ---> Using cache ---> 37dedfac54ed Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img ---> Using cache ---> dfbc26aea8cb Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev0" '' ---> Using cache 
---> 1fc0cb0eabf3 Successfully built 1fc0cb0eabf3 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33262/kubevirt/registry-disk-v1alpha:devel ---> 7463d4232d9d Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 9344ae6324bf Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2 ---> Using cache ---> a0ae1c3d8273 Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev0" '' ---> Using cache ---> 9f71570ac984 Successfully built 9f71570ac984 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33262/kubevirt/registry-disk-v1alpha:devel ---> 7463d4232d9d Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 9344ae6324bf Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso ---> Using cache ---> 790d22e44882 Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-dev0" '' ---> Using cache ---> 01559c73c467 Successfully built 01559c73c467 Sending build context to Docker daemon 35.59 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl ---> Using cache ---> d74088d7a4fc Step 4/8 : WORKDIR /home/virtctl ---> Using cache ---> c8c857bf8d96 Step 5/8 : USER 1001 ---> Using cache ---> 36730a67b946 Step 6/8 : COPY subresource-access-test /subresource-access-test ---> 9addfb3be0e8 Removing intermediate container 3b54e0fcebd1 Step 7/8 : ENTRYPOINT /subresource-access-test ---> Running in 2a2852561964 ---> 9eb3d8e4ff13 Removing intermediate container 2a2852561964 Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "subresource-access-test" '' ---> Running in 32365c364457 ---> 35088e3db1ce Removing intermediate container 32365c364457 Successfully built 35088e3db1ce Sending build context to Docker daemon 3.072 kB Step 1/9 : FROM fedora:28 ---> cc510acfcd70 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 84920e004a40 Step 3/9 : ENV container docker ---> Using cache ---> aed3ca4ac3a3 Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all ---> Using cache ---> 6050b24a5d85 Step 5/9 : ENV GIMME_GO_VERSION 1.9.2 ---> Using cache ---> 0447d2178073 Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh ---> Using cache ---> 291db82d955f Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin" ---> Using cache ---> 793556477837 Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli ---> Using cache ---> fd5c6e1f9461 Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-dev0" '' "winrmcli" '' ---> Using cache ---> da11ebf4ba97 Successfully built da11ebf4ba97 Sending build context to Docker daemon 36.8 MB Step 1/5 : FROM fedora:27 ---> 9110ae7f579f Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 71a8c548e503 Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar ---> 91d0c577ae13 Removing intermediate container 87c1d9fc1557 Step 4/5 : ENTRYPOINT /example-hook-sidecar ---> Running in 851da8ab7a39 ---> 094b61046b7d Removing intermediate container 851da8ab7a39 Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.11.0-dev0" '' ---> Running in 
826692fa224a ---> fd76e467634b Removing intermediate container 826692fa224a Successfully built fd76e467634b hack/build-docker.sh push The push refers to a repository [localhost:33262/kubevirt/virt-controller] 5663188984be: Preparing b2f5abdac324: Preparing 891e1e4ef82a: Preparing b2f5abdac324: Pushed 5663188984be: Pushed 891e1e4ef82a: Pushed devel: digest: sha256:82c544937a186ca2a9af324a17162ade4d21d082c5dc6d75ed2d8cc704e9083c size: 949 The push refers to a repository [localhost:33262/kubevirt/virt-launcher] e85c0953c671: Preparing 2520def5c816: Preparing 802c8b216b12: Preparing ff7f8dea83b4: Preparing f24c6ac9ae88: Preparing 0b99c4111657: Preparing da38cf808aa5: Preparing b83399358a92: Preparing 186d8b3e4fd8: Preparing fa6154170bf5: Preparing 5eefb9960a36: Preparing 891e1e4ef82a: Preparing 0b99c4111657: Waiting fa6154170bf5: Waiting 5eefb9960a36: Waiting da38cf808aa5: Waiting 891e1e4ef82a: Waiting b83399358a92: Waiting 186d8b3e4fd8: Waiting e85c0953c671: Pushed 2520def5c816: Pushed ff7f8dea83b4: Pushed da38cf808aa5: Pushed b83399358a92: Pushed 802c8b216b12: Pushed fa6154170bf5: Pushed 186d8b3e4fd8: Pushed 891e1e4ef82a: Mounted from kubevirt/virt-controller f24c6ac9ae88: Pushed 0b99c4111657: Pushed 5eefb9960a36: Pushed devel: digest: sha256:b485f572144f4d94553b177801be7d808d0a54ace702f268e3a11f7e53abfe75 size: 2828 The push refers to a repository [localhost:33262/kubevirt/virt-handler] 440e594c41af: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-launcher 440e594c41af: Pushed devel: digest: sha256:8b45ca448bd5ca952e5c125e796f8a75bb3ba9446aed3f9d397bae0e42e6a06c size: 741 The push refers to a repository [localhost:33262/kubevirt/virt-api] 42d90f5c3276: Preparing afd1d781e4d1: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-handler afd1d781e4d1: Pushed 42d90f5c3276: Pushed devel: digest: sha256:5eb161df8cc063d0b09ca7ef4bb85e29b195dad014b613f9f8744ab5291d5ec7 size: 948 The push refers to a repository [localhost:33262/kubevirt/disks-images-provider] dc0875c44573: Preparing 8fc77a44094f: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-api dc0875c44573: Pushed 8fc77a44094f: Pushed devel: digest: sha256:2b43f0dc4407be0e48ccc5a561ccf42478a295101d921399e0a317072f461f7f size: 948 The push refers to a repository [localhost:33262/kubevirt/vm-killer] d1b69e768421: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider d1b69e768421: Pushed devel: digest: sha256:e8115e6add090b1a40956ec96e5d7216d60b0c798b08bf3e66a5b2357ceafb0f size: 740 The push refers to a repository [localhost:33262/kubevirt/registry-disk-v1alpha] 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 2a15632f54d4: Pushed 91a924e03d7c: Pushed 25edbec0eaea: Pushed devel: digest: sha256:8c7387bda36b477019753aef422fcef2ecaa496610b511f18b5e778676c68ed0 size: 948 The push refers to a repository [localhost:33262/kubevirt/cirros-registry-disk-demo] c0e0ac6363a3: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 2a15632f54d4: Mounted from kubevirt/registry-disk-v1alpha 25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha 91a924e03d7c: Mounted from kubevirt/registry-disk-v1alpha c0e0ac6363a3: Pushed devel: digest: sha256:44ba09ef8826276848727b58411a0eebf291b3ee28455387792805a123fe0e2e size: 1160 The push refers to a repository [localhost:33262/kubevirt/fedora-cloud-registry-disk-demo] b5fcbb50202d: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: 
Preparing 25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo 91a924e03d7c: Mounted from kubevirt/cirros-registry-disk-demo 2a15632f54d4: Mounted from kubevirt/cirros-registry-disk-demo b5fcbb50202d: Pushed devel: digest: sha256:ba5317bae78e69e01bdb240c54b577bc4ea748665ad2d63bd5b4347202c28aa3 size: 1161 The push refers to a repository [localhost:33262/kubevirt/alpine-registry-disk-demo] 0348707bf8ed: Preparing 2a15632f54d4: Preparing 91a924e03d7c: Preparing 25edbec0eaea: Preparing 91a924e03d7c: Mounted from kubevirt/fedora-cloud-registry-disk-demo 2a15632f54d4: Mounted from kubevirt/fedora-cloud-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo 0348707bf8ed: Pushed devel: digest: sha256:24abc7372689b392405bafd9825b6b2f4890d531c4814f69597f14be65ccec83 size: 1160 The push refers to a repository [localhost:33262/kubevirt/subresource-access-test] 13146b98d34a: Preparing 4052ce9d0aff: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/vm-killer 4052ce9d0aff: Pushed 13146b98d34a: Pushed devel: digest: sha256:6e046859305cef12072f93597809b18d43430a0fefe298f8fd1c997ed783044b size: 948 The push refers to a repository [localhost:33262/kubevirt/winrmcli] 64ccc7ac4271: Preparing 4242962b50c3: Preparing 0e374d8c733e: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/subresource-access-test 64ccc7ac4271: Pushed 0e374d8c733e: Pushed 4242962b50c3: Pushed devel: digest: sha256:a9de87c18b211d6a25b356243bda63ef5188e6a50b1be0e65d18dbd9993a045f size: 1165 The push refers to a repository [localhost:33262/kubevirt/example-hook-sidecar] b8c131c34889: Preparing 39bae602f753: Preparing b8c131c34889: Pushed 39bae602f753: Pushed devel: digest: sha256:36c6b56c9d736a532c1e24cf4f93276d6847eb1adfc6a56faa6488c322c1b0ba size: 740 make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt' Done ./cluster/clean.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.11.0-dev ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-dev0 ++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-dev0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/.git ']' ++++ git 
describe --always --tags
+++ echo v0.7.0-180-gd2174f4
++ KUBEVIRT_VERSION=v0.7.0-180-gd2174f4
+ source cluster/k8s-1.11.0/provider.sh
++ set -e
++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.11.0
++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.11.0.sh
++ source hack/config-provider-k8s-1.11.0.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl
+++ docker_prefix=localhost:33262/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
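[editor's note] The cleanup that follows is a label-driven sweep: each KubeVirt resource type is deleted by its kubevirt.io label in every candidate namespace, so the script never needs to know object names. A condensed sketch of the pattern; the resource list here is abbreviated (the real clean.sh also covers apiservices, webhooks, CRDs, PVs and the RBAC objects, and tolerates types that do not exist yet, like the failing "vmis" lookup just below):

  # Condensed form of the sweep in ./cluster/clean.sh; list abbreviated.
  export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
  for ns in default kube-system; do
      for kind in deployment rs services ds pods secrets pvc serviceaccounts; do
          kubectl -n "$ns" delete "$kind" -l kubevirt.io
      done
  done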
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n default delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete secrets -l kubevirt.io No resources found + _kubectl -n default delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pv -l kubevirt.io No resources found + _kubectl -n default delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pvc -l kubevirt.io No resources found + _kubectl -n default delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete ds -l kubevirt.io No resources found + _kubectl -n default delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n default delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete pods -l kubevirt.io No resources found + 
_kubectl -n default delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n default delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete rolebinding -l kubevirt.io No resources found + _kubectl -n default delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete roles -l kubevirt.io No resources found + _kubectl -n default delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete clusterroles -l kubevirt.io No resources found + _kubectl -n default delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n default delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io No resources found. Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + for i in '${namespaces[@]}' + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete deployment -l kubevirt.io No resources found + _kubectl -n kube-system delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rs -l kubevirt.io No resources found + _kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export 
KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig ++ cluster/k8s-1.11.0/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ wc -l No resources found. 
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done ./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.11.0-dev ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-dev0 ++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-dev0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-180-gd2174f4 ++ KUBEVIRT_VERSION=v0.7.0-180-gd2174f4 + source cluster/k8s-1.11.0/provider.sh ++ set -e ++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.11.0 ++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.11.0.sh ++ source hack/config-provider-k8s-1.11.0.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig +++ 
kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl +++ docker_prefix=localhost:33262/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... + [[ -z k8s-1.11.0-dev ]] + [[ k8s-1.11.0-dev =~ .*-dev ]] + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/dev -R + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/dev -R serviceaccount/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver-auth-delegator created rolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created role.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-apiserver created clusterrole.rbac.authorization.k8s.io/kubevirt-controller created serviceaccount/kubevirt-controller created serviceaccount/kubevirt-privileged created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller-cluster-admin created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-privileged-cluster-admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:admin created clusterrole.rbac.authorization.k8s.io/kubevirt.io:edit created clusterrole.rbac.authorization.k8s.io/kubevirt.io:view created clusterrole.rbac.authorization.k8s.io/kubevirt.io:default created clusterrolebinding.rbac.authorization.k8s.io/kubevirt.io:default created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancereplicasets.kubevirt.io created service/virt-api created deployment.extensions/virt-api created service/virt-controller created deployment.extensions/virt-controller created daemonset.extensions/virt-handler created customresourcedefinition.apiextensions.k8s.io/virtualmachines.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstances.kubevirt.io created customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancepresets.kubevirt.io created + _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R + export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig + cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R persistentvolumeclaim/disk-alpine created persistentvolume/host-path-disk-alpine created persistentvolumeclaim/disk-custom created persistentvolume/host-path-disk-custom created daemonset.extensions/disks-images-provider created serviceaccount/kubevirt-testing created clusterrolebinding.rbac.authorization.k8s.io/kubevirt-testing-cluster-admin created + [[ k8s-1.11.0 =~ os-* ]] + echo Done Done + namespaces=(kube-system default) + [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]] + timeout=300 + sample=30 + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get 
pods -n kube-system --no-headers ++ grep -v Running + '[' -n 'virt-api-7d79975b94-9j92f 0/1 ContainerCreating 0 2s virt-controller-67dcdd8464-w2xzj 0/1 ContainerCreating 0 2s virt-controller-67dcdd8464-wvph9 0/1 ContainerCreating 0 2s virt-handler-92p79 0/1 ContainerCreating 0 2s virt-handler-sn5j5 0/1 ContainerCreating 0 2s' ']' + echo 'Waiting for kubevirt pods to enter the Running state ...' Waiting for kubevirt pods to enter the Running state ... + kubectl get pods -n kube-system --no-headers + grep -v Running + cluster/kubectl.sh get pods -n kube-system --no-headers virt-api-7d79975b94-9j92f 0/1 ContainerCreating 0 2s virt-controller-67dcdd8464-w2xzj 0/1 ContainerCreating 0 2s virt-controller-67dcdd8464-wvph9 0/1 ContainerCreating 0 2s virt-handler-92p79 0/1 ContainerCreating 0 2s virt-handler-sn5j5 0/1 ContainerCreating 0 2s + sleep 30 + current_time=30 + '[' 30 -gt 300 ']' ++ kubectl get pods -n kube-system --no-headers ++ cluster/kubectl.sh get pods -n kube-system --no-headers ++ grep -v Running + '[' -n '' ']' + current_time=0 ++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ grep false + '[' -n false ']' + echo 'Waiting for KubeVirt containers to become ready ...' Waiting for KubeVirt containers to become ready ... + kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers + grep false + cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers false + sleep 30 + current_time=30 + '[' 30 -gt 300 ']' ++ grep false ++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers + '[' -n '' ']' + kubectl get pods -n kube-system + cluster/kubectl.sh get pods -n kube-system NAME READY STATUS RESTARTS AGE coredns-78fcdf6894-2mwwm 1/1 Running 0 8m coredns-78fcdf6894-cnvck 1/1 Running 0 8m disks-images-provider-bbjxx 1/1 Running 0 1m disks-images-provider-ssdb6 1/1 Running 0 1m etcd-node01 1/1 Running 0 7m kube-apiserver-node01 1/1 Running 0 7m kube-controller-manager-node01 1/1 Running 0 7m kube-flannel-ds-fmhjl 1/1 Running 0 8m kube-flannel-ds-npmnx 1/1 Running 0 8m kube-proxy-8kv4d 1/1 Running 0 8m kube-proxy-xpsmz 1/1 Running 0 8m kube-scheduler-node01 1/1 Running 0 7m virt-api-7d79975b94-9j92f 1/1 Running 0 1m virt-controller-67dcdd8464-w2xzj 1/1 Running 0 1m virt-controller-67dcdd8464-wvph9 1/1 Running 0 1m virt-handler-92p79 1/1 Running 0 1m virt-handler-sn5j5 1/1 Running 0 1m + for i in '${namespaces[@]}' + current_time=0 ++ kubectl get pods -n default --no-headers ++ grep -v Running ++ cluster/kubectl.sh get pods -n default --no-headers + '[' -n '' ']' + current_time=0 ++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ grep false ++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers + '[' -n '' ']' + kubectl get pods -n default + cluster/kubectl.sh get pods -n default NAME READY STATUS RESTARTS AGE local-volume-provisioner-gxwq9 1/1 Running 0 8m local-volume-provisioner-jfbzm 1/1 Running 0 8m + kubectl version + cluster/kubectl.sh version Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", 
GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:17:28Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:08:34Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"} + ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/junit.xml' + [[ k8s-1.11.0-dev =~ windows.* ]] + FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-dev/junit.xml' + make functest hack/dockerized "hack/build-func-tests.sh" sha256:ceba12cbc33e4e37a707840478a630db561e2427b78c8c9f9cd6d0b73276ab32 go version go1.10 linux/amd64 go version go1.10 linux/amd64 Compiling tests... compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1533202667 Will run 151 of 151 specs Pod name: disks-images-provider-bbjxx Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-ssdb6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-9j92f Pod phase: Running level=info timestamp=2018-08-02T09:37:38.890031Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:37:39.034713Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:37:43.335123Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:37:43 http: TLS handshake error from 10.244.1.1:35338: EOF 2018/08/02 09:37:53 http: TLS handshake error from 10.244.1.1:35344: EOF level=info timestamp=2018-08-02T09:38:01.880925Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:38:01.907082Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:38:03 http: TLS handshake error from 10.244.1.1:35350: EOF level=info timestamp=2018-08-02T09:38:08.608068Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:38:09.467691Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:38:10.295265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:38:13.402633Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:38:13 http: TLS handshake error 
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:37:38.890031Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:37:39.034713Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:37:43.335123Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:37:43 http: TLS handshake error from 10.244.1.1:35338: EOF
2018/08/02 09:37:53 http: TLS handshake error from 10.244.1.1:35344: EOF
level=info timestamp=2018-08-02T09:38:01.880925Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:38:01.907082Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:38:03 http: TLS handshake error from 10.244.1.1:35350: EOF
level=info timestamp=2018-08-02T09:38:08.608068Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:38:09.467691Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:38:10.295265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:38:13.402633Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:38:13 http: TLS handshake error from 10.244.1.1:35356: EOF
level=info timestamp=2018-08-02T09:38:15.040329Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:38:15.041519Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:35:43.721841Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-08-02T09:35:43.722110Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-08-02T09:35:43.727960Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-08-02T09:35:43.728026Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-02T09:35:43.728066Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-02T09:35:43.728149Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-02T09:37:47.777428Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.795968Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.800551Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.800746Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.801244Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.801586Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.820465Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.820584Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:48.271322Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqjjnl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:38:11.326096Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:38:12.607040Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjjnl, existing: true\n"
level=info timestamp=2018-08-02T09:38:12.607133Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:38:12.607156Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:38:12.607253Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:38:12.607526Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock"
level=error timestamp=2018-08-02T09:38:12.607598Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:38:12.607664Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:38:15.168662Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjjnl, existing: true\n"
level=info timestamp=2018-08-02T09:38:15.168752Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:38:15.168775Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:38:15.168853Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:38:15.169426Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock"
level=error timestamp=2018-08-02T09:38:15.169853Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:38:15.170161Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:38:15.410550Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikjw7c"
level=info timestamp=2018-08-02T09:38:15.458141Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiwhkvn, existing: true\n"
level=info timestamp=2018-08-02T09:38:15.458224Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:38:15.458247Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:38:15.458369Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:38:15.458644Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiwhkvn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiwhkvn/sock"
level=error timestamp=2018-08-02T09:38:15.458740Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiwhkvn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:38:15.458847Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiwhkvn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwhkvn"
level=info timestamp=2018-08-02T09:38:15.570735Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikjw7c, existing: true\n"
level=info timestamp=2018-08-02T09:38:15.570802Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:38:15.570824Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:38:15.570909Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:38:15.571096Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock"
level=error timestamp=2018-08-02T09:38:15.571157Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:38:15.571277Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikjw7c"
Pod name: virt-launcher-testvmi2sdx2-g7bxg
Pod phase: Running
level=info timestamp=2018-08-02T09:37:56.253318Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:56.255103Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:56.257294Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:14.012302Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:14.042037Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi2sdx2"
level=info timestamp=2018-08-02T09:38:14.045453Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:14.045683Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmikjw7c-6mfk7
Pod phase: Running
level=info timestamp=2018-08-02T09:37:55.606895Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:55.607292Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:55.609533Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:14.270475Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:14.298691Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikjw7c"
level=info timestamp=2018-08-02T09:38:14.300082Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:14.300252Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiqjjnl-6kfnh
Pod phase: Running
level=info timestamp=2018-08-02T09:37:54.273727Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:54.274665Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:54.276788Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:04.300958Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:04.357405Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqjjnl"
level=info timestamp=2018-08-02T09:38:04.358954Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:04.359116Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiwhkvn-gbhvv
Pod phase: Running
level=info timestamp=2018-08-02T09:37:55.301506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:55.304119Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:55.318274Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:13.429643Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:13.461190Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwhkvn"
level=info timestamp=2018-08-02T09:38:13.464353Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:13.464531Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
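Every virt-handler error in the dump above is the same symptom: the per-VMI unix socket under /var/run/kubevirt/kubevirt-test-default/ is never dialable, even though each virt-launcher reports "Marked as ready". Two quick checks one could run while the suite is stuck are sketched below; the "compute" container name and running ss on node01 are assumptions about this environment, not something the log confirms.

    # Does the launcher pod actually have the socket the handler is dialing?
    cluster/kubectl.sh exec -n kubevirt-test-default virt-launcher-testvmikjw7c-6mfk7 -c compute -- \
        ls -l /var/run/kubevirt/kubevirt-test-default/
    # On node01: which processes own kubevirt unix sockets, if any?
    ss -xp | grep kubevirt

If the socket exists inside the pod but not on the host path the handler reads, the shared hostPath mount between virt-launcher and virt-handler is the first thing to suspect.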
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:40:02.089495Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:02.119552Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:03 http: TLS handshake error from 10.244.1.1:35422: EOF
level=info timestamp=2018-08-02T09:40:09.455920Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:40:09.552283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:13.630313Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:13 http: TLS handshake error from 10.244.1.1:35428: EOF
2018/08/02 09:40:23 http: TLS handshake error from 10.244.1.1:35434: EOF
level=info timestamp=2018-08-02T09:40:32.143305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:32.171914Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:33 http: TLS handshake error from 10.244.1.1:35440: EOF
level=info timestamp=2018-08-02T09:40:39.461385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:40:39.571165Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:43.683968Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:43 http: TLS handshake error from 10.244.1.1:35446: EOF
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:35:43.721841Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer"
level=info timestamp=2018-08-02T09:35:43.722110Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-08-02T09:35:43.727960Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-08-02T09:35:43.728026Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-02T09:35:43.728066Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-02T09:35:43.728149Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-02T09:37:47.777428Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.795968Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.800551Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.800746Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwhkvn kind= uid=b718db1a-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.801244Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.801586Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.820465Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.820584Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:48.271322Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqjjnl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:38:30.532254Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:38:51.012500Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjjnl, existing: true\n"
level=info timestamp=2018-08-02T09:38:51.012587Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:38:51.012615Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:38:51.012714Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:38:51.013013Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock"
level=error timestamp=2018-08-02T09:38:51.013084Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:38:51.013174Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:39:31.973435Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqjjnl, existing: true\n"
level=info timestamp=2018-08-02T09:39:31.973535Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:39:31.973560Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:39:31.973641Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:39:31.974338Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock"
level=error timestamp=2018-08-02T09:39:31.974416Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:39:31.974507Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqjjnl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:39:37.070845Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiwhkvn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwhkvn"
level=info timestamp=2018-08-02T09:39:37.187380Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikjw7c, existing: true\n"
level=info timestamp=2018-08-02T09:39:37.187462Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:39:37.187485Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:39:37.187563Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:39:37.187849Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock"
level=error timestamp=2018-08-02T09:39:37.187952Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:39:37.188418Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikjw7c"
level=info timestamp=2018-08-02T09:39:37.749240Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi2sdx2, existing: true\n"
level=info timestamp=2018-08-02T09:39:37.749335Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:39:37.749371Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:39:37.749471Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:39:37.749782Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2sdx2/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi2sdx2/sock"
level=error timestamp=2018-08-02T09:39:37.749859Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi2sdx2 kind= uid=b71a1c56-9637-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2sdx2/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:39:37.749947Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2sdx2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2sdx2"
Pod name: virt-launcher-testvmi2sdx2-g7bxg
Pod phase: Running
level=info timestamp=2018-08-02T09:37:56.253318Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:56.255103Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:56.257294Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:14.012302Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:14.042037Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi2sdx2"
level=info timestamp=2018-08-02T09:38:14.045453Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:14.045683Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmikjw7c-6mfk7
Pod phase: Running
level=info timestamp=2018-08-02T09:37:55.606895Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:55.607292Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:55.609533Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:14.270475Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:14.298691Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikjw7c"
level=info timestamp=2018-08-02T09:38:14.300082Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:14.300252Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiqjjnl-6kfnh
Pod phase: Running
level=info timestamp=2018-08-02T09:37:54.273727Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:54.274665Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:54.276788Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:04.300958Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:04.357405Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqjjnl"
level=info timestamp=2018-08-02T09:38:04.358954Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:04.359116Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiwhkvn-gbhvv
Pod phase: Running
level=info timestamp=2018-08-02T09:37:55.301506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:37:55.304119Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:37:55.318274Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:38:13.429643Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:38:13.461190Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwhkvn"
level=info timestamp=2018-08-02T09:38:13.464353Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:38:13.464531Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [181.268 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmikjw7c,b6e532b7-9637-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory
    Expected
        <string>: Warning
    not to equal
        <string>: Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T09:37:48.469131Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmikjw7c kind=VirtualMachineInstance uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmikjw7c-6mfk7"
level=info timestamp=2018-08-02T09:38:15.748325Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmikjw7c kind=VirtualMachineInstance uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmikjw7c-6mfk7"
level=error timestamp=2018-08-02T09:38:15.759561Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmikjw7c kind=VirtualMachineInstance uid=b6e532b7-9637-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikjw7c/sock: connect: no such file or directory"
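The assertion at tests/utils.go:247 fails the spec as soon as a Warning event is recorded against the VMI, which is why the suite aborts in BeforeEach instead of waiting out the full launch timeout. The same events the test watcher saw can be pulled from the cluster directly; the sketch below uses only standard kubectl flags.

    # List Warning events in the test namespace, oldest first.
    cluster/kubectl.sh get events -n kubevirt-test-default \
        --field-selector type=Warning --sort-by=.lastTimestamp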
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
2018/08/02 09:40:13 http: TLS handshake error from 10.244.1.1:35428: EOF
2018/08/02 09:40:23 http: TLS handshake error from 10.244.1.1:35434: EOF
level=info timestamp=2018-08-02T09:40:32.143305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:32.171914Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:33 http: TLS handshake error from 10.244.1.1:35440: EOF
level=info timestamp=2018-08-02T09:40:39.461385Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:40:39.571165Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:40:43.683968Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:40:43 http: TLS handshake error from 10.244.1.1:35446: EOF
2018/08/02 09:40:53 http: TLS handshake error from 10.244.1.1:35452: EOF
level=info timestamp=2018-08-02T09:41:02.207862Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:41:02.235258Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:41:03 http: TLS handshake error from 10.244.1.1:35458: EOF
level=info timestamp=2018-08-02T09:41:09.437483Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:41:09.592840Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:37:47.801586Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.820465Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.820584Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:48.271322Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqjjnl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:40:49.180223Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.181412Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.193127Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijj799 kind= uid=235030fe-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.193249Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijj799 kind= uid=235030fe-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.204635Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.204781Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.236230Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.236344Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.269368Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6w6hf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:40:49.269476Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidjkvj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj"
level=info timestamp=2018-08-02T09:40:49.270011Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijj799\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijj799"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:41:05.333497Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:41:06.613750Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6w6hf, existing: true\n"
level=info timestamp=2018-08-02T09:41:06.613844Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:41:06.613879Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:41:06.614021Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:41:06.614322Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock"
level=error timestamp=2018-08-02T09:41:06.614397Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:41:06.614487Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:41:09.174716Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6w6hf, existing: true\n"
level=info timestamp=2018-08-02T09:41:09.174805Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:41:09.174836Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:41:09.174965Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:41:09.175219Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock"
level=error timestamp=2018-08-02T09:41:09.175308Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:41:09.175413Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:41:10.331942Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik5fqq"
level=info timestamp=2018-08-02T09:41:10.350108Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidjkvj, existing: true\n"
level=info timestamp=2018-08-02T09:41:10.350407Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:41:10.350464Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:41:10.350536Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:41:10.350758Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock"
level=error timestamp=2018-08-02T09:41:10.350836Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:41:10.350918Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj"
level=info timestamp=2018-08-02T09:41:10.372116Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik5fqq, existing: true\n"
level=info timestamp=2018-08-02T09:41:10.372159Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:41:10.372182Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:41:10.372240Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:41:10.372367Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock"
level=error timestamp=2018-08-02T09:41:10.372425Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:41:10.372498Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik5fqq"
Pod name: virt-launcher-testvmi6w6hf-bktdc
Pod phase: Running
level=info timestamp=2018-08-02T09:40:53.162674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:40:53.163347Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:40:53.164933Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:41:03.173256Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:41:03.215401Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6w6hf"
level=info timestamp=2018-08-02T09:41:03.218536Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:41:03.218793Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmidjkvj-tqbqv
Pod phase: Running
level=info timestamp=2018-08-02T09:40:55.952292Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:40:55.952589Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:40:55.955529Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:41:08.529467Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:41:08.671333Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidjkvj"
level=info timestamp=2018-08-02T09:41:08.675739Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:41:08.675911Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmijj799-bvdvd
Pod phase: Running
level=info timestamp=2018-08-02T09:40:54.643741Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:40:54.644501Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:40:54.660906Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:41:08.591783Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:41:08.694711Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmijj799"
level=info timestamp=2018-08-02T09:41:08.696336Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:41:08.704182Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmik5fqq-6j6zb
Pod phase: Running
level=info timestamp=2018-08-02T09:40:56.263782Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:40:56.264069Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:40:56.265813Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:41:09.278139Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:41:09.301743Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmik5fqq"
level=info timestamp=2018-08-02T09:41:09.303080Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:41:09.303253Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
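Since the two virt-handler daemons log the authoritative per-VMI state, following both at once is often the fastest way to see whether a launcher socket ever becomes dialable. A sketch is below; the kubevirt.io=virt-handler label is an assumption about this deployment's labels, not something the log confirms.

    # Tail both virt-handler pods concurrently.
    for p in $(cluster/kubectl.sh get pods -n kube-system -l kubevirt.io=virt-handler -o name); do
        cluster/kubectl.sh logs -f -n kube-system "$p" --tail=5 &
    done
    wait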
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
2018/08/02 09:43:03 http: TLS handshake error from 10.244.1.1:35530: EOF
level=info timestamp=2018-08-02T09:43:09.456229Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:43:09.841392Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:13 http: TLS handshake error from 10.244.1.1:35536: EOF
level=info timestamp=2018-08-02T09:43:13.974582Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:43:14.701867Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:43:14.703153Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:43:23 http: TLS handshake error from 10.244.1.1:35542: EOF
level=info timestamp=2018-08-02T09:43:32.459983Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:43:32.485292Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:33 http: TLS handshake error from 10.244.1.1:35548: EOF
level=info timestamp=2018-08-02T09:43:39.439438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:43:39.866387Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:43 http: TLS handshake error from 10.244.1.1:35554: EOF
level=info timestamp=2018-08-02T09:43:44.028044Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:37:47.801586Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikjw7c kind= uid=b6e532b7-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:47.820465Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:37:47.820584Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqjjnl kind= uid=b7254f3c-9637-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:37:48.271322Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqjjnl\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqjjnl"
level=info timestamp=2018-08-02T09:40:49.180223Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.181412Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.193127Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijj799 kind= uid=235030fe-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.193249Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijj799 kind= uid=235030fe-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.204635Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.204781Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.236230Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.236344Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.269368Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6w6hf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:40:49.269476Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidjkvj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj"
level=info timestamp=2018-08-02T09:40:49.270011Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijj799\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijj799"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:41:45.020485Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:42:25.982241Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6w6hf, existing: true\n"
level=info timestamp=2018-08-02T09:42:25.982341Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:42:25.982366Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:42:25.982448Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:42:25.982663Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock"
level=error timestamp=2018-08-02T09:42:25.982743Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:42:25.982819Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:43:47.904585Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6w6hf, existing: true\n"
level=info timestamp=2018-08-02T09:43:47.904700Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:43:47.904742Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:43:47.904844Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:43:47.905183Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock"
level=error timestamp=2018-08-02T09:43:47.905270Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:43:47.905381Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6w6hf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf" Pod name: virt-handler-sn5j5 Pod phase: Running level=info timestamp=2018-08-02T09:42:30.789946Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmijj799/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmijj799" level=info timestamp=2018-08-02T09:42:32.201834Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidjkvj, existing: true\n" level=info timestamp=2018-08-02T09:42:32.201918Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:42:32.201942Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:42:32.202026Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:42:32.202301Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock" level=error timestamp=2018-08-02T09:42:32.202373Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidjkvj kind= uid=234dec19-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:42:32.202453Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj" level=info timestamp=2018-08-02T09:42:32.224399Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik5fqq, existing: true\n" level=info timestamp=2018-08-02T09:42:32.224469Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:42:32.224491Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:42:32.224595Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:42:32.224854Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock" level=error timestamp=2018-08-02T09:42:32.224922Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:42:32.224997Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik5fqq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik5fqq" Pod name: virt-launcher-testvmi6w6hf-bktdc Pod phase: Running level=info timestamp=2018-08-02T09:40:53.162674Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:40:53.163347Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:40:53.164933Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:41:03.173256Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:41:03.215401Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6w6hf" level=info timestamp=2018-08-02T09:41:03.218536Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:41:03.218793Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmidjkvj-tqbqv Pod phase: Running level=info timestamp=2018-08-02T09:40:55.952292Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:40:55.952589Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:40:55.955529Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" 
level=info timestamp=2018-08-02T09:41:08.529467Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:41:08.671333Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidjkvj" level=info timestamp=2018-08-02T09:41:08.675739Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:41:08.675911Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmijj799-bvdvd Pod phase: Running level=info timestamp=2018-08-02T09:40:54.643741Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:40:54.644501Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:40:54.660906Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:41:08.591783Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:41:08.694711Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmijj799" level=info timestamp=2018-08-02T09:41:08.696336Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:41:08.704182Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmik5fqq-6j6zb Pod phase: Running level=info timestamp=2018-08-02T09:40:56.263782Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:40:56.264069Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:40:56.265813Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:41:09.278139Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:41:09.301743Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmik5fqq" level=info timestamp=2018-08-02T09:41:09.303080Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:41:09.303253Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [181.539 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be able to reach [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 the Inbound VirtualMachineInstance with pod network connectivity explicitly set /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Unexpected Warning event received: testvmidjkvj,234dec19-9638-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 
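[Editor's note] The failing setup above turns on one recurring error: virt-handler repeatedly dials the per-VMI unix socket (/var/run/kubevirt/kubevirt-test-default/<vmi>/sock) that virt-launcher is expected to create, gets "connect: no such file or directory" while the socket does not exist yet, and re-enqueues the VMI. A minimal Go sketch of that dial-and-retry pattern, using a path copied from the log purely as an illustration (this is not KubeVirt source):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Illustrative path taken from the log above; virt-launcher is expected
	// to create this socket once the domain is set up.
	sock := "/var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock"
	for attempt := 1; attempt <= 3; attempt++ {
		conn, err := net.DialTimeout("unix", sock, time.Second)
		if err != nil {
			// While the socket file is missing, dial fails with
			// "connect: no such file or directory" and the caller re-enqueues
			// the VMI, exactly the loop visible in the virt-handler log.
			fmt.Printf("re-enqueuing after attempt %d: %v\n", attempt, err)
			time.Sleep(time.Second)
			continue
		}
		conn.Close()
		fmt.Println("connected")
		return
	}
}
```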
level=info timestamp=2018-08-02T09:40:49.811454Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmidjkvj kind=VirtualMachineInstance uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmidjkvj-tqbqv"
level=info timestamp=2018-08-02T09:41:10.756433Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmidjkvj kind=VirtualMachineInstance uid=234dec19-9638-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmidjkvj-tqbqv"
level=error timestamp=2018-08-02T09:41:10.784565Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmidjkvj kind=VirtualMachineInstance uid=234dec19-9638-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidjkvj/sock: connect: no such file or directory"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:43:14.703153Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:43:23 http: TLS handshake error from 10.244.1.1:35542: EOF
level=info timestamp=2018-08-02T09:43:32.459983Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:43:32.485292Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:33 http: TLS handshake error from 10.244.1.1:35548: EOF
level=info timestamp=2018-08-02T09:43:39.439438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:43:39.866387Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:43 http: TLS handshake error from 10.244.1.1:35554: EOF
level=info timestamp=2018-08-02T09:43:44.028044Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:43:53 http: TLS handshake error from 10.244.1.1:35560: EOF
level=info timestamp=2018-08-02T09:44:02.517566Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:44:02.535789Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:44:03 http: TLS handshake error from 10.244.1.1:35566: EOF
level=info timestamp=2018-08-02T09:44:09.438964Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:44:09.882117Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:40:49.204781Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.236230Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.236344Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.269368Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6w6hf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:40:49.269476Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidjkvj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj"
level=info timestamp=2018-08-02T09:40:49.270011Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijj799\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijj799"
level=info timestamp=2018-08-02T09:43:50.713310Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.714697Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.719984Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.720136Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.730376Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.730484Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.745821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.745910Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:51.326928Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipczfp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipczfp"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:44:06.542643Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnx4r"
level=info timestamp=2018-08-02T09:44:07.822854Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibnx4r, existing: true\n"
level=info timestamp=2018-08-02T09:44:07.822985Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:44:07.823015Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:44:07.823125Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:44:07.823454Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock"
level=error timestamp=2018-08-02T09:44:07.823539Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:44:07.823637Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnx4r"
level=info timestamp=2018-08-02T09:44:10.383973Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibnx4r, existing: true\n"
level=info timestamp=2018-08-02T09:44:10.384061Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:44:10.384092Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:44:10.384194Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:44:10.384538Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock"
level=error timestamp=2018-08-02T09:44:10.384618Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:44:10.384719Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnx4r"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:44:10.605455Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvminhzmn"
level=info timestamp=2018-08-02T09:44:10.625599Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvminhzmn, existing: true\n"
level=info timestamp=2018-08-02T09:44:10.625624Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:44:10.625645Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:44:10.625744Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:44:10.625844Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock"
level=error timestamp=2018-08-02T09:44:10.625895Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:44:10.625958Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvminhzmn"
level=info timestamp=2018-08-02T09:44:10.666116Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvminhzmn, existing: true\n"
level=info timestamp=2018-08-02T09:44:10.666154Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:44:10.666214Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:44:10.666271Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:44:10.666361Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock"
level=error timestamp=2018-08-02T09:44:10.666408Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:44:10.666482Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvminhzmn"
Pod name: virt-launcher-testvmi7tsw8-rmd4g
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.548773Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.549184Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.550550Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:10.351102Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:10.391577Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7tsw8"
level=info timestamp=2018-08-02T09:44:10.395202Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:10.395360Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmibnx4r-72f92
Pod phase: Running
level=info timestamp=2018-08-02T09:43:54.141090Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:54.142424Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:54.144111Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:04.152091Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:04.203519Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibnx4r"
level=info timestamp=2018-08-02T09:44:04.206365Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:04.206541Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvminhzmn-tdjfr
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.181410Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.182302Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.190283Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:10.165759Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:10.205084Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvminhzmn"
level=info timestamp=2018-08-02T09:44:10.207361Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:10.207583Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmipczfp-mjzkk
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.176159Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.176997Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.178381Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:09.718122Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:09.855887Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipczfp"
level=info timestamp=2018-08-02T09:44:09.859804Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:09.860043Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
2018/08/02 09:46:03 http: TLS handshake error from 10.244.1.1:35638: EOF
level=info timestamp=2018-08-02T09:46:09.463238Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:46:09.963799Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:13 http: TLS handshake error from 10.244.1.1:35644: EOF
level=info timestamp=2018-08-02T09:46:14.296560Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:14.707835Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:46:14.710768Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:46:23 http: TLS handshake error from 10.244.1.1:35650: EOF
level=info timestamp=2018-08-02T09:46:32.771390Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:32.792709Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:33 http: TLS handshake error from 10.244.1.1:35656: EOF
level=info timestamp=2018-08-02T09:46:39.494343Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:46:39.985969Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:43 http: TLS handshake error from 10.244.1.1:35662: EOF
level=info timestamp=2018-08-02T09:46:44.355618Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:40:49.204781Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6w6hf kind= uid=2351bc37-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.236230Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:40:49.236344Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5fqq kind= uid=23548b78-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:40:49.269368Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6w6hf\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6w6hf"
level=info timestamp=2018-08-02T09:40:49.269476Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidjkvj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidjkvj"
level=info timestamp=2018-08-02T09:40:49.270011Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijj799\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijj799"
level=info timestamp=2018-08-02T09:43:50.713310Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.714697Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminhzmn kind= uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.719984Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.720136Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.730376Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.730484Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.745821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.745910Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:51.326928Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipczfp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipczfp"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:45:27.188551Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:45:27.188833Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock"
level=error timestamp=2018-08-02T09:45:27.194381Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:45:27.194534Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnx4r"
level=info timestamp=2018-08-02T09:46:31.745721Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6w6hf, existing: false\n"
level=info timestamp=2018-08-02T09:46:31.745821Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:46:31.745986Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:46:31.746108Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi6w6hf kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:46:49.114861Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibnx4r, existing: true\n"
level=info timestamp=2018-08-02T09:46:49.114965Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:46:49.114996Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:46:49.115098Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:46:49.115429Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock"
level=error timestamp=2018-08-02T09:46:49.115519Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:46:49.115610Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnx4r/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnx4r"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:45:32.519637Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvminhzmn"
level=info timestamp=2018-08-02T09:45:32.871681Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipczfp, existing: true\n"
level=info timestamp=2018-08-02T09:45:32.871780Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:45:32.871812Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:45:32.871917Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:45:32.872234Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipczfp/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipczfp/sock"
level=error timestamp=2018-08-02T09:45:32.872330Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipczfp/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:45:32.872423Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipczfp/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipczfp"
level=info timestamp=2018-08-02T09:45:33.052234Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7tsw8, existing: true\n"
level=info timestamp=2018-08-02T09:45:33.052358Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:45:33.052398Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:45:33.052496Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:45:33.053053Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7tsw8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7tsw8/sock"
level=error timestamp=2018-08-02T09:45:33.053144Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7tsw8 kind= uid=8f83cd80-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7tsw8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:45:33.053238Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7tsw8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7tsw8"
Pod name: virt-launcher-testvmi7tsw8-rmd4g
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.548773Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.549184Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.550550Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:10.351102Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:10.391577Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7tsw8"
level=info timestamp=2018-08-02T09:44:10.395202Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:10.395360Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmibnx4r-72f92
Pod phase: Running
level=info timestamp=2018-08-02T09:43:54.141090Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:54.142424Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:54.144111Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:04.152091Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:04.203519Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibnx4r"
level=info timestamp=2018-08-02T09:44:04.206365Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:04.206541Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvminhzmn-tdjfr
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.181410Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.182302Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.190283Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:10.165759Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:10.205084Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvminhzmn"
level=info timestamp=2018-08-02T09:44:10.207361Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:10.207583Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmipczfp-mjzkk
Pod phase: Running
level=info timestamp=2018-08-02T09:43:57.176159Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:43:57.176997Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:43:57.178381Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:44:09.718122Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:44:09.855887Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipczfp"
level=info timestamp=2018-08-02T09:44:09.859804Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:44:09.860043Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.505 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the Inbound VirtualMachineInstance with custom MAC address
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvminhzmn,8f82610b-9638-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
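[Editor's note] Both failures are reported by the same test helper at /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247: during setup the test watches the VMI's Kubernetes events and aborts on the first event of type Warning, which is why the assertion reads "Expected : Warning not to equal : Warning". A simplified, hypothetical rendering of that check in Go (names and structure are illustrative, not the actual tests/utils.go code):

```go
package main

import "fmt"

// event mirrors the two fields the check cares about: the Kubernetes event
// type ("Normal" or "Warning") and its message.
type event struct {
	Type    string
	Message string
}

// failOnWarning drains events and fails on the first Warning, mimicking the
// assertion behind the two failure reports above.
func failOnWarning(events <-chan event) error {
	for e := range events {
		if e.Type == "Warning" {
			return fmt.Errorf("unexpected warning event received: %s", e.Message)
		}
	}
	return nil
}

func main() {
	ch := make(chan event, 2)
	ch <- event{"Normal", "Created virtual machine pod virt-launcher-testvminhzmn-tdjfr"}
	ch <- event{"Warning", "dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory"}
	close(ch)
	if err := failOnWarning(ch); err != nil {
		fmt.Println("FAIL:", err)
	}
}
```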
level=info timestamp=2018-08-02T09:43:51.611535Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvminhzmn kind=VirtualMachineInstance uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvminhzmn-tdjfr" level=info timestamp=2018-08-02T09:44:11.083632Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvminhzmn kind=VirtualMachineInstance uid=8f82610b-9638-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvminhzmn-tdjfr" level=error timestamp=2018-08-02T09:44:11.094620Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvminhzmn kind=VirtualMachineInstance uid=8f82610b-9638-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvminhzmn/sock: connect: no such file or directory" Pod name: disks-images-provider-bbjxx Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-ssdb6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-9j92f Pod phase: Running level=info timestamp=2018-08-02T09:46:14.710768Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:46:23 http: TLS handshake error from 10.244.1.1:35650: EOF level=info timestamp=2018-08-02T09:46:32.771390Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:46:32.792709Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:46:33 http: TLS handshake error from 10.244.1.1:35656: EOF level=info timestamp=2018-08-02T09:46:39.494343Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:46:39.985969Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:46:43 http: TLS handshake error from 10.244.1.1:35662: EOF level=info timestamp=2018-08-02T09:46:44.355618Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:46:53 http: TLS handshake error from 10.244.1.1:35668: EOF level=info timestamp=2018-08-02T09:47:02.882224Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:47:02.883580Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:47:03 http: TLS handshake error from 10.244.1.1:35674: EOF level=info timestamp=2018-08-02T09:47:09.455999Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info 
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:46:14.710768Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:46:23 http: TLS handshake error from 10.244.1.1:35650: EOF
level=info timestamp=2018-08-02T09:46:32.771390Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:32.792709Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:33 http: TLS handshake error from 10.244.1.1:35656: EOF
level=info timestamp=2018-08-02T09:46:39.494343Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:46:39.985969Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:43 http: TLS handshake error from 10.244.1.1:35662: EOF
level=info timestamp=2018-08-02T09:46:44.355618Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:53 http: TLS handshake error from 10.244.1.1:35668: EOF
level=info timestamp=2018-08-02T09:47:02.882224Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:47:02.883580Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:03 http: TLS handshake error from 10.244.1.1:35674: EOF
level=info timestamp=2018-08-02T09:47:09.455999Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:47:10.009058Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:43:50.730484Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.745821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.745910Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:51.326928Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipczfp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipczfp"
level=info timestamp=2018-08-02T09:46:52.233874Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.235883Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.249214Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpwr8 kind= uid=fbb5d436-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.249339Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpwr8 kind= uid=fbb5d436-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.266513Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.266679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.299152Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.299237Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.322076Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivpwr8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivpwr8"
level=info timestamp=2018-08-02T09:46:52.322184Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz2njq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:46:53.046070Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4kksh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4kksh"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:47:07.980063Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi5knbz"
level=info timestamp=2018-08-02T09:47:09.260339Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi5knbz, existing: true\n"
level=info timestamp=2018-08-02T09:47:09.260430Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:09.260456Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:09.260595Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:09.260981Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock"
level=error timestamp=2018-08-02T09:47:09.261064Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:09.261142Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi5knbz"
level=info timestamp=2018-08-02T09:47:11.822070Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi5knbz, existing: true\n"
level=info timestamp=2018-08-02T09:47:11.822154Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:11.822179Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:11.822277Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:11.822589Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock"
level=error timestamp=2018-08-02T09:47:11.822663Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:11.822746Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi5knbz"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:47:12.338258Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:47:12.358413Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiz2njq, existing: true\n"
level=info timestamp=2018-08-02T09:47:12.358443Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:12.358468Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:12.358516Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:12.358740Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock"
level=error timestamp=2018-08-02T09:47:12.358802Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:12.358872Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:47:12.399158Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiz2njq, existing: true\n"
level=info timestamp=2018-08-02T09:47:12.399239Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:12.399262Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:12.399339Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:12.399555Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock"
level=error timestamp=2018-08-02T09:47:12.399614Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:12.399713Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"

Pod name: virt-launcher-testvmi4kksh-nndxk
Pod phase: Running
level=info timestamp=2018-08-02T09:46:58.740361Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:58.742565Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:58.744238Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.995563Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:12.021620Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4kksh"
level=info timestamp=2018-08-02T09:47:12.023447Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:12.023708Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi5knbz-wxxgn
Pod phase: Running
level=info timestamp=2018-08-02T09:46:56.027068Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:56.028728Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:56.030244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:06.038100Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:06.059188Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi5knbz"
level=info timestamp=2018-08-02T09:47:06.060461Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:06.060617Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmivpwr8-45x7j
Pod phase: Running
level=info timestamp=2018-08-02T09:46:57.794521Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:57.797308Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:57.800032Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.612689Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:11.735291Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivpwr8"
level=info timestamp=2018-08-02T09:47:11.737091Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:11.737258Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiz2njq-skk29
Pod phase: Running
level=info timestamp=2018-08-02T09:46:59.135470Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:59.136746Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:59.156483Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.618017Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:11.675847Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiz2njq"
level=info timestamp=2018-08-02T09:47:11.679559Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:11.679731Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
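The virt-controller entries in the dump above show the standard Kubernetes optimistic-concurrency failure ("the object has been modified; please apply your changes to the latest version and try again") being answered by requeuing the VMI. A client that hits the same conflict directly can retry with client-go's retry.RetryOnConflict; a minimal sketch (the namespace, pod name, and label here are placeholders, and this is the generic client-go pattern, not KubeVirt's actual requeue path):

// conflict_retry.go: re-read and re-apply an update whenever the API
// server rejects it with a resource-version conflict.
package main

import (
	"context"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/util/retry"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// RetryOnConflict re-runs the closure with backoff for as long as
	// the update keeps failing with a conflict error.
	err = retry.RetryOnConflict(retry.DefaultRetry, func() error {
		pod, err := cs.CoreV1().Pods("default").Get(context.TODO(), "example-pod", metav1.GetOptions{})
		if err != nil {
			return err
		}
		if pod.Labels == nil {
			pod.Labels = map[string]string{}
		}
		pod.Labels["example"] = "retried" // placeholder mutation
		_, err = cs.CoreV1().Pods("default").Update(context.TODO(), pod, metav1.UpdateOptions{})
		return err
	})
	if err != nil {
		log.Fatal(err)
	}
}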
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
2018/08/02 09:49:03 http: TLS handshake error from 10.244.1.1:35746: EOF
level=info timestamp=2018-08-02T09:49:09.452975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:49:10.090176Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:49:13 http: TLS handshake error from 10.244.1.1:35752: EOF
level=info timestamp=2018-08-02T09:49:14.625726Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:49:14.793145Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:49:14.794645Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:49:23 http: TLS handshake error from 10.244.1.1:35758: EOF
level=info timestamp=2018-08-02T09:49:33.108050Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:49:33.121555Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:49:33 http: TLS handshake error from 10.244.1.1:35764: EOF
level=info timestamp=2018-08-02T09:49:39.469528Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:49:40.111336Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:49:43 http: TLS handshake error from 10.244.1.1:35770: EOF
level=info timestamp=2018-08-02T09:49:44.690466Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:43:50.730484Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnx4r kind= uid=8f853a1a-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:50.745821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:43:50.745910Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipczfp kind= uid=8f86b5ec-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:43:51.326928Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipczfp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipczfp"
level=info timestamp=2018-08-02T09:46:52.233874Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.235883Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.249214Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpwr8 kind= uid=fbb5d436-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.249339Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivpwr8 kind= uid=fbb5d436-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.266513Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.266679Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.299152Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:46:52.299237Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:46:52.322076Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivpwr8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivpwr8"
level=info timestamp=2018-08-02T09:46:52.322184Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz2njq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:46:53.046070Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4kksh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4kksh"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:48:28.626596Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:28.626846Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock"
level=error timestamp=2018-08-02T09:48:28.627082Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:28.627216Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi5knbz"
level=info timestamp=2018-08-02T09:49:32.956029Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibnx4r, existing: false\n"
level=info timestamp=2018-08-02T09:49:32.956122Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:49:32.956211Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:49:32.956336Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmibnx4r kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:49:50.549454Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi5knbz, existing: true\n"
level=info timestamp=2018-08-02T09:49:50.549547Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:49:50.549577Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:49:50.549690Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:49:50.556183Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock"
level=error timestamp=2018-08-02T09:49:50.556321Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind= uid=fbb8789f-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:49:50.556437Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi5knbz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi5knbz"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:48:33.906193Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivpwr8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivpwr8"
level=info timestamp=2018-08-02T09:48:34.250498Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiz2njq, existing: true\n"
level=info timestamp=2018-08-02T09:48:34.250608Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:48:34.250640Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:34.250767Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:34.252194Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock"
level=error timestamp=2018-08-02T09:48:34.252278Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiz2njq kind= uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:34.252455Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:48:35.315856Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4kksh, existing: true\n"
level=info timestamp=2018-08-02T09:48:35.315947Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:48:35.315979Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:35.316765Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:35.317086Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4kksh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi4kksh/sock"
level=error timestamp=2018-08-02T09:48:35.317176Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4kksh kind= uid=fbbb110c-9638-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4kksh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:35.317253Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4kksh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4kksh"

Pod name: virt-launcher-testvmi4kksh-nndxk
Pod phase: Running
level=info timestamp=2018-08-02T09:46:58.740361Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:58.742565Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:58.744238Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.995563Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:12.021620Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4kksh"
level=info timestamp=2018-08-02T09:47:12.023447Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:12.023708Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi5knbz-wxxgn
Pod phase: Running
level=info timestamp=2018-08-02T09:46:56.027068Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:56.028728Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:56.030244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:06.038100Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:06.059188Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi5knbz"
level=info timestamp=2018-08-02T09:47:06.060461Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:06.060617Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmivpwr8-45x7j
Pod phase: Running
level=info timestamp=2018-08-02T09:46:57.794521Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:57.797308Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:57.800032Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.612689Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:11.735291Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivpwr8"
level=info timestamp=2018-08-02T09:47:11.737091Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:11.737258Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiz2njq-skk29
Pod phase: Running
level=info timestamp=2018-08-02T09:46:59.135470Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:46:59.136746Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:46:59.156483Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:47:11.618017Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:47:11.675847Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiz2njq"
level=info timestamp=2018-08-02T09:47:11.679559Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:47:11.679731Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.537 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be able to reach [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    the internet
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmiz2njq,fbb4049b-9638-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory
    Expected
      : Warning
    not to equal
      : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T09:46:52.840620Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiz2njq kind=VirtualMachineInstance uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiz2njq-skk29"
level=info timestamp=2018-08-02T09:47:12.815006Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiz2njq kind=VirtualMachineInstance uid=fbb4049b-9638-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiz2njq-skk29"
level=error timestamp=2018-08-02T09:47:12.823203Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiz2njq kind=VirtualMachineInstance uid=fbb4049b-9638-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiz2njq/sock: connect: no such file or directory"
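Both failures reduce to the same assertion in tests/utils.go:247: the test watches the VMI's Events and fails the spec the moment a Warning arrives. A minimal client-go sketch of that kind of check (the function name, field selector, and kubeconfig wiring here are illustrative, not KubeVirt's actual test helper):

// warning_watch.go: fail as soon as any Warning event is recorded for
// the named object, mirroring the "Expected ... not to equal ... Warning"
// assertion in the reports above.
package main

import (
	"context"
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func watchForWarnings(cs kubernetes.Interface, namespace, name string) error {
	w, err := cs.CoreV1().Events(namespace).Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "involvedObject.name=" + name, // events for this object only
	})
	if err != nil {
		return err
	}
	defer w.Stop()

	for ev := range w.ResultChan() {
		event, ok := ev.Object.(*corev1.Event)
		if !ok {
			continue
		}
		if event.Type == corev1.EventTypeWarning {
			return fmt.Errorf("unexpected warning event received: %s", event.Message)
		}
	}
	return nil
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	if err := watchForWarnings(cs, "kubevirt-test-default", "testvmiz2njq"); err != nil {
		log.Fatal(err)
	}
}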
pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-w2xzj Pod phase: Running level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-67dcdd8464-wvph9 Pod phase: Running level=info timestamp=2018-08-02T09:46:52.322076Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivpwr8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivpwr8" level=info timestamp=2018-08-02T09:46:52.322184Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz2njq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq" level=info timestamp=2018-08-02T09:46:53.046070Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4kksh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4kksh" level=info timestamp=2018-08-02T09:49:53.763768Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:53.764948Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:49:53.774747Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi46949 kind= uid=67e92127-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:53.774868Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi46949 kind= uid=67e92127-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:49:53.785887Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:53.786008Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:49:53.809566Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:53.809742Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance 
as initialized" level=info timestamp=2018-08-02T09:49:53.839643Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi46949\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi46949" level=info timestamp=2018-08-02T09:49:53.977105Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiktjhs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs" level=info timestamp=2018-08-02T09:49:54.576612Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9" level=info timestamp=2018-08-02T09:49:55.005262Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9" Pod name: virt-handler-92p79 Pod phase: Running level=info timestamp=2018-08-02T09:50:10.643648Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs" level=info timestamp=2018-08-02T09:50:11.923950Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiktjhs, existing: true\n" level=info timestamp=2018-08-02T09:50:11.924045Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:50:11.924080Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:50:11.924216Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:50:11.924566Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock" level=error timestamp=2018-08-02T09:50:11.924663Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:50:11.924786Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs" level=info timestamp=2018-08-02T09:50:14.491054Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiktjhs, existing: true\n" level=info timestamp=2018-08-02T09:50:14.491220Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:50:14.508821Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:50:14.508981Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:50:14.509553Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock" level=error timestamp=2018-08-02T09:50:14.509630Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:50:14.509707Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs" Pod name: virt-handler-sn5j5 Pod phase: Running level=info timestamp=2018-08-02T09:50:15.012453Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi459k8" level=info timestamp=2018-08-02T09:50:15.053812Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi459k8, existing: true\n" level=info timestamp=2018-08-02T09:50:15.053891Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:50:15.053922Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:50:15.054012Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:50:15.054238Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock" level=error timestamp=2018-08-02T09:50:15.054305Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance 
failed." level=info timestamp=2018-08-02T09:50:15.054377Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi459k8" level=info timestamp=2018-08-02T09:50:15.134540Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi459k8, existing: true\n" level=info timestamp=2018-08-02T09:50:15.134614Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:50:15.134638Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:50:15.134734Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:50:15.134896Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock" level=error timestamp=2018-08-02T09:50:15.134962Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:50:15.135035Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi459k8" Pod name: virt-launcher-testvmi459k8-bt89t Pod phase: Running level=info timestamp=2018-08-02T09:50:00.376857Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:50:00.377135Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:50:00.379041Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:50:13.293349Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:50:13.346671Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi459k8" level=info timestamp=2018-08-02T09:50:13.348462Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:50:13.348618Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmi46949-n6ztv Pod phase: Running level=info timestamp=2018-08-02T09:49:59.310110Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:49:59.316233Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:49:59.341194Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" 
level=info timestamp=2018-08-02T09:50:12.041823Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:12.100559Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi46949"
level=info timestamp=2018-08-02T09:50:12.109520Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:12.109734Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmifdmr9-v2sdd
Pod phase: Running
level=info timestamp=2018-08-02T09:50:00.144303Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:50:00.144571Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:50:00.146094Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:13.112004Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:13.178722Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifdmr9"
level=info timestamp=2018-08-02T09:50:13.184243Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:13.184456Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiktjhs-8qcsm
Pod phase: Running
level=info timestamp=2018-08-02T09:49:57.725071Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:57.725377Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:57.727439Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:07.735842Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:07.810107Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiktjhs"
level=info timestamp=2018-08-02T09:50:07.812346Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:07.812510Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:52:03.380730Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:03 http: TLS handshake error from 10.244.1.1:35854: EOF
level=info timestamp=2018-08-02T09:52:09.443352Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:52:10.217779Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:13 http: TLS handshake error from 10.244.1.1:35860: EOF
level=info timestamp=2018-08-02T09:52:14.964583Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:23 http: TLS handshake error from 10.244.1.1:35866: EOF
level=info timestamp=2018-08-02T09:52:33.419813Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:52:33.441417Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:33 http: TLS handshake error from 10.244.1.1:35872: EOF
level=info timestamp=2018-08-02T09:52:39.502971Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:52:40.244209Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:43 http: TLS handshake error from 10.244.1.1:35878: EOF
level=info timestamp=2018-08-02T09:52:45.018458Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:53 http: TLS handshake error from 10.244.1.1:35884: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:46:52.322076Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivpwr8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivpwr8"
level=info timestamp=2018-08-02T09:46:52.322184Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiz2njq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiz2njq"
level=info timestamp=2018-08-02T09:46:53.046070Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4kksh\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4kksh"
level=info timestamp=2018-08-02T09:49:53.763768Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:53.764948Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:53.774747Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi46949 kind= uid=67e92127-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:53.774868Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi46949 kind= uid=67e92127-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:53.785887Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:53.786008Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:53.809566Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:53.809742Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:53.839643Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi46949\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi46949"
level=info timestamp=2018-08-02T09:49:53.977105Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiktjhs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs"
level=info timestamp=2018-08-02T09:49:54.576612Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:49:55.005262Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:51:31.316285Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:31.316549Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock"
level=error timestamp=2018-08-02T09:51:31.316789Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:51:31.316884Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs"
level=info timestamp=2018-08-02T09:52:34.396713Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi5knbz, existing: false\n"
level=info timestamp=2018-08-02T09:52:34.396812Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:52:34.396957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:52:34.397097Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi5knbz kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:52:53.240128Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiktjhs, existing: true\n"
level=info timestamp=2018-08-02T09:52:53.240242Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:52:53.240274Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:52:53.240352Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:52:53.240604Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock"
level=error timestamp=2018-08-02T09:52:53.240673Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind= uid=67ea7a0a-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:52:53.240740Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiktjhs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:51:35.251467Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi46949/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi46949"
level=info timestamp=2018-08-02T09:51:36.464160Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmifdmr9, existing: true\n"
level=info timestamp=2018-08-02T09:51:36.464237Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:51:36.464261Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:51:36.464340Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:36.464634Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifdmr9/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmifdmr9/sock"
level=error timestamp=2018-08-02T09:51:36.464714Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmifdmr9 kind= uid=67eca88d-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifdmr9/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
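The paired isolation.go:82 errors ("Could not get owner Pid of socket ...") suggest that, once the dial succeeds, the handler identifies the launcher process by asking the kernel for the peer credentials of the connected socket. A sketch of that technique under that assumption (the golang.org/x/sys/unix calls are real; the surrounding flow is illustrative, not KubeVirt's actual isolation code):

    // socketOwnerPid returns the PID of the process on the other end of a unix
    // socket, via the SO_PEERCRED option on the connected file descriptor.
    package main

    import (
        "fmt"
        "net"

        "golang.org/x/sys/unix"
    )

    func socketOwnerPid(path string) (int32, error) {
        conn, err := net.Dial("unix", path) // fails with ENOENT until the launcher creates the socket
        if err != nil {
            return -1, err
        }
        defer conn.Close()

        raw, err := conn.(*net.UnixConn).SyscallConn()
        if err != nil {
            return -1, err
        }
        var cred *unix.Ucred
        var credErr error
        if err := raw.Control(func(fd uintptr) {
            cred, credErr = unix.GetsockoptUcred(int(fd), unix.SOL_SOCKET, unix.SO_PEERCRED)
        }); err != nil {
            return -1, err
        }
        if credErr != nil {
            return -1, credErr
        }
        return cred.Pid, nil
    }

    func main() {
        fmt.Println(socketOwnerPid("/var/run/kubevirt/kubevirt-test-default/testvmifdmr9/sock"))
    }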
level=info timestamp=2018-08-02T09:51:36.464803Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifdmr9/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:51:36.904609Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi459k8, existing: true\n"
level=info timestamp=2018-08-02T09:51:36.904714Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:51:36.904742Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:51:36.904848Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:36.905142Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock"
level=error timestamp=2018-08-02T09:51:36.905204Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi459k8 kind= uid=67e73223-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:51:36.905273Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi459k8"

Pod name: virt-launcher-testvmi459k8-bt89t
Pod phase: Running
level=info timestamp=2018-08-02T09:50:00.376857Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:50:00.377135Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:50:00.379041Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:13.293349Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:13.346671Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi459k8"
level=info timestamp=2018-08-02T09:50:13.348462Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:13.348618Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi46949-n6ztv
Pod phase: Running
level=info timestamp=2018-08-02T09:49:59.310110Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:59.316233Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:59.341194Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:12.041823Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:12.100559Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi46949"
level=info timestamp=2018-08-02T09:50:12.109520Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:12.109734Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmifdmr9-v2sdd
Pod phase: Running
level=info timestamp=2018-08-02T09:50:00.144303Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:50:00.144571Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:50:00.146094Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:13.112004Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:13.178722Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifdmr9"
level=info timestamp=2018-08-02T09:50:13.184243Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:13.184456Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiktjhs-8qcsm
Pod phase: Running
level=info timestamp=2018-08-02T09:49:57.725071Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:57.725377Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:57.727439Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:07.735842Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:07.810107Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiktjhs"
level=info timestamp=2018-08-02T09:50:07.812346Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:07.812510Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.523 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on the same node from Pod
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmi459k8,67e73223-9639-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T09:49:54.452253Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi459k8 kind=VirtualMachineInstance uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi459k8-bt89t"
level=info timestamp=2018-08-02T09:50:15.472237Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi459k8 kind=VirtualMachineInstance uid=67e73223-9639-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi459k8-bt89t"
level=error timestamp=2018-08-02T09:50:15.477212Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi459k8 kind=VirtualMachineInstance uid=67e73223-9639-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi459k8/sock: connect: no such file or directory"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:52:33.441417Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:33 http: TLS handshake error from 10.244.1.1:35872: EOF
level=info timestamp=2018-08-02T09:52:39.502971Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:52:40.244209Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:43 http: TLS handshake error from 10.244.1.1:35878: EOF
level=info timestamp=2018-08-02T09:52:45.018458Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:53 http: TLS handshake error from 10.244.1.1:35884: EOF
level=info timestamp=2018-08-02T09:53:03.497921Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:53:03.499133Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:53:03 http: TLS handshake error from 10.244.1.1:35890: EOF
level=info timestamp=2018-08-02T09:53:09.526944Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:53:10.267064Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:53:13 http: TLS handshake error from 10.244.1.1:35896: EOF
level=info timestamp=2018-08-02T09:53:14.753930Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:53:14.756196Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:49:53.839643Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi46949\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi46949"
level=info timestamp=2018-08-02T09:49:53.977105Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiktjhs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs"
level=info timestamp=2018-08-02T09:49:54.576612Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:49:55.005262Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:52:55.287077Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.288253Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.304804Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.304974Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.326578Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.326712Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.358696Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.358772Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.375885Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:52:55.497160Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6lvj2\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:52:55.910379Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:53:11.613699Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidpqjs"
level=info timestamp=2018-08-02T09:53:12.256070Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidpqjs, existing: true\n"
level=info timestamp=2018-08-02T09:53:12.256152Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:53:12.256175Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:53:12.256253Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:53:12.256425Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock"
level=error timestamp=2018-08-02T09:53:12.256484Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
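The virt-controller "Operation cannot be fulfilled ... the object has been modified" lines are ordinary optimistic-concurrency conflicts, not part of this failure: an update raced with another writer, so the controller logs at info level and re-enqueues the VMI to retry against the latest resourceVersion. The standard client-go idiom for the same situation looks roughly like this (an illustration only; the conflict here is simulated rather than produced by a real API server):

    // Retry an update when the API server answers 409 Conflict, as in the
    // "please apply your changes to the latest version and try again" messages.
    package main

    import (
        "errors"
        "fmt"

        k8serrors "k8s.io/apimachinery/pkg/api/errors"
        "k8s.io/apimachinery/pkg/runtime/schema"
        "k8s.io/client-go/util/retry"
    )

    func main() {
        gr := schema.GroupResource{Group: "kubevirt.io", Resource: "virtualmachineinstances"}
        attempts := 0
        err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
            attempts++ // a real controller would re-GET the object here, then update it
            if attempts < 3 {
                return k8serrors.NewConflict(gr, "testvmifdmr9", errors.New("the object has been modified"))
            }
            return nil
        })
        fmt.Println(attempts, err) // 3 <nil>
    }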
level=info timestamp=2018-08-02T09:53:12.256562Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidpqjs"
level=info timestamp=2018-08-02T09:53:13.536778Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidpqjs, existing: true\n"
level=info timestamp=2018-08-02T09:53:13.536858Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:53:13.536881Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:53:13.537234Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:53:13.537403Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock"
level=error timestamp=2018-08-02T09:53:13.537487Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:53:13.537568Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidpqjs"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:53:14.886261Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:53:14.907581Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6lvj2, existing: true\n"
level=info timestamp=2018-08-02T09:53:14.907637Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:53:14.907681Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:53:14.907738Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:53:14.907848Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock"
level=error timestamp=2018-08-02T09:53:14.907896Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:53:14.907962Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:53:14.955295Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi6lvj2, existing: true\n"
level=info timestamp=2018-08-02T09:53:14.955389Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:53:14.955414Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:53:14.955495Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:53:14.955947Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock"
level=error timestamp=2018-08-02T09:53:14.956033Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:53:14.956109Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"

Pod name: virt-launcher-testvmi6lvj2-c96tn
Pod phase: Running
level=info timestamp=2018-08-02T09:53:02.170027Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:02.170346Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:02.171612Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:13.600257Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:13.651469Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6lvj2"
level=info timestamp=2018-08-02T09:53:13.653138Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:13.653404Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmid7fqp-gr24m
Pod phase: Running
level=info timestamp=2018-08-02T09:53:01.866599Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:01.866906Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:01.881166Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
Pod name: virt-launcher-testvmidpqjs-mrmbg
Pod phase: Running
level=info timestamp=2018-08-02T09:52:59.252027Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:52:59.252343Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:52:59.254066Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:09.265250Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:09.298171Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidpqjs"
level=info timestamp=2018-08-02T09:53:09.309439Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:09.309709Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiqnxmn-tvb6b
Pod phase: Running
level=info timestamp=2018-08-02T09:53:01.981404Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:01.981671Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:01.982829Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:14.934297Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:14.987938Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqnxmn"
level=info timestamp=2018-08-02T09:53:14.989339Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:14.989506Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:55:03.694221Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:03 http: TLS handshake error from 10.244.1.1:35962: EOF
level=info timestamp=2018-08-02T09:55:09.451520Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:55:10.357260Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:13 http: TLS handshake error from 10.244.1.1:35968: EOF
level=info timestamp=2018-08-02T09:55:15.285766Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:23 http: TLS handshake error from 10.244.1.1:35974: EOF
level=info timestamp=2018-08-02T09:55:33.737087Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:55:33.749904Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:33 http: TLS handshake error from 10.244.1.1:35980: EOF
level=info timestamp=2018-08-02T09:55:39.462217Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:55:40.379053Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:43 http: TLS handshake error from 10.244.1.1:35986: EOF
level=info timestamp=2018-08-02T09:55:45.340063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:53 http: TLS handshake error from 10.244.1.1:35992: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:49:53.839643Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi46949\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi46949"
level=info timestamp=2018-08-02T09:49:53.977105Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiktjhs\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiktjhs"
level=info timestamp=2018-08-02T09:49:54.576612Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:49:55.005262Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifdmr9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifdmr9"
level=info timestamp=2018-08-02T09:52:55.287077Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.288253Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6lvj2 kind= uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.304804Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.304974Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.326578Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.326712Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.358696Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.358772Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.375885Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:52:55.497160Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6lvj2\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:52:55.910379Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:54:32.904967Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:54:32.905447Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock"
level=error timestamp=2018-08-02T09:54:32.905555Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:54:32.905646Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidpqjs"
level=info timestamp=2018-08-02T09:55:37.081076Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiktjhs, existing: false\n"
level=info timestamp=2018-08-02T09:55:37.081162Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:55:37.081264Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:55:37.081367Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiktjhs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:55:54.826162Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidpqjs, existing: true\n"
level=info timestamp=2018-08-02T09:55:54.826245Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:55:54.826276Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:55:54.826413Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:55:54.826772Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock"
level=error timestamp=2018-08-02T09:55:54.826883Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:55:54.827568Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidpqjs/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidpqjs"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:54:36.806428Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:54:37.381503Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiqnxmn, existing: true\n"
level=info timestamp=2018-08-02T09:54:37.381601Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:54:37.381633Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:54:37.381990Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:54:37.382972Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqnxmn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiqnxmn/sock"
level=error timestamp=2018-08-02T09:54:37.383106Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqnxmn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:54:37.383197Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqnxmn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqnxmn"
level=info timestamp=2018-08-02T09:54:37.715722Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid7fqp, existing: true\n"
level=info timestamp=2018-08-02T09:54:37.715820Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:54:37.715851Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:54:37.715942Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:54:37.716260Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid7fqp/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmid7fqp/sock"
level=error timestamp=2018-08-02T09:54:37.716332Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid7fqp kind= uid=d41b0e43-9639-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid7fqp/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:54:37.716538Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid7fqp/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"

Pod name: virt-launcher-testvmi6lvj2-c96tn
Pod phase: Running
level=info timestamp=2018-08-02T09:53:02.170027Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:02.170346Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:02.171612Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:13.600257Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:13.651469Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6lvj2"
level=info timestamp=2018-08-02T09:53:13.653138Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:13.653404Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmid7fqp-gr24m
Pod phase: Running
level=info timestamp=2018-08-02T09:53:01.866599Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:01.866906Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:01.881166Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
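Each healthy virt-launcher dump in this log ends with the same four-step start-up: connect to libvirtd, create a watchdog file under /var/run/kubevirt/watchdog-files/, register the libvirt event callback, and log "Marked as ready". The watchdog file acts as a heartbeat: the launcher keeps its timestamp fresh so a watcher can treat a stale file as a dead VMI. A hedged sketch of that idea only (the path, interval, and shutdown handling below are assumptions, not KubeVirt's actual values or code):

    // heartbeat touches a watchdog file on every tick and removes it on shutdown;
    // a separate watcher would compare the file's mtime against a timeout.
    package main

    import (
        "os"
        "time"
    )

    func heartbeat(path string, interval time.Duration, stop <-chan struct{}) error {
        f, err := os.OpenFile(path, os.O_CREATE|os.O_WRONLY, 0o644)
        if err != nil {
            return err
        }
        f.Close()
        t := time.NewTicker(interval)
        defer t.Stop()
        for {
            select {
            case <-stop:
                return os.Remove(path)
            case now := <-t.C:
                if err := os.Chtimes(path, now, now); err != nil {
                    return err
                }
            }
        }
    }

    func main() {
        stop := make(chan struct{})
        go func() { time.Sleep(3 * time.Second); close(stop) }()
        _ = heartbeat("/tmp/kubevirt-test-default_testvmi459k8", time.Second, stop)
    }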
level=info timestamp=2018-08-02T09:53:15.360580Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:15.383562Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid7fqp"
level=info timestamp=2018-08-02T09:53:15.385628Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:15.385795Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmidpqjs-mrmbg
Pod phase: Running
level=info timestamp=2018-08-02T09:52:59.252027Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:52:59.252343Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:52:59.254066Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:09.265250Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:09.298171Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidpqjs"
level=info timestamp=2018-08-02T09:53:09.309439Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:09.309709Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiqnxmn-tvb6b
Pod phase: Running
level=info timestamp=2018-08-02T09:53:01.981404Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:01.981671Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:01.982829Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:14.934297Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:14.987938Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqnxmn"
level=info timestamp=2018-08-02T09:53:14.989339Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:14.989506Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.556 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on a different node from Pod
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmi6lvj2,d41979a4-9639-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
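Both specs fail identically at tests/utils.go:247, in BeforeEach rather than in the test body: while waiting for the VMI to start, the test consumes the VMI's Kubernetes events and asserts that none of them has type Warning, so the first socket-dial warning emitted by virt-handler aborts the spec. The shape of that check, under the assumption that events arrive on a channel (the helper name and plumbing are invented for illustration; this is not the actual tests/utils.go code):

    // failOnWarning mirrors the reported expectation: the first Warning-type
    // event observed for the VMI is turned into a failure.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func failOnWarning(events <-chan corev1.Event) error {
        for ev := range events {
            if ev.Type == corev1.EventTypeWarning {
                return fmt.Errorf("unexpected warning event received: %s", ev.Message)
            }
        }
        return nil
    }

    func main() {
        ch := make(chan corev1.Event, 1)
        ch <- corev1.Event{Type: corev1.EventTypeWarning, Message: "dial unix ...: connect: no such file or directory"}
        close(ch)
        fmt.Println(failOnWarning(ch))
    }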
level=info timestamp=2018-08-02T09:52:55.968677Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi6lvj2 kind=VirtualMachineInstance uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi6lvj2-c96tn"
level=info timestamp=2018-08-02T09:53:15.355431Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi6lvj2 kind=VirtualMachineInstance uid=d41979a4-9639-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi6lvj2-c96tn"
level=error timestamp=2018-08-02T09:53:15.362147Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi6lvj2 kind=VirtualMachineInstance uid=d41979a4-9639-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi6lvj2/sock: connect: no such file or directory"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:55:33.749904Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:33 http: TLS handshake error from 10.244.1.1:35980: EOF
level=info timestamp=2018-08-02T09:55:39.462217Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:55:40.379053Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:43 http: TLS handshake error from 10.244.1.1:35986: EOF
level=info timestamp=2018-08-02T09:55:45.340063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:55:53 http: TLS handshake error from 10.244.1.1:35992: EOF
2018/08/02 09:56:03 http: TLS handshake error from 10.244.1.1:35998: EOF
level=info timestamp=2018-08-02T09:56:03.813513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:56:03.818205Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:56:09.457510Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:56:10.405803Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:13 http: TLS handshake error from 10.244.1.1:36004: EOF
level=info timestamp=2018-08-02T09:56:14.761965Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:56:14.769034Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:52:55.326578Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.326712Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.358696Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.358772Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.375885Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:52:55.497160Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6lvj2\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:52:55.910379Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:55:56.843601Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.844870Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.851461Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.851634Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.862550Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.862637Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.872429Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.872498Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:56:11.345516Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi47txh"
level=info timestamp=2018-08-02T09:56:11.985729Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi47txh, existing: true\n"
level=info timestamp=2018-08-02T09:56:11.985814Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:56:11.985848Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:56:11.985974Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:56:11.986294Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock"
level=error timestamp=2018-08-02T09:56:11.986400Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:56:11.986524Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi47txh"
level=info timestamp=2018-08-02T09:56:13.267193Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi47txh, existing: true\n"
level=info timestamp=2018-08-02T09:56:13.267284Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:56:13.267314Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:56:13.267408Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:56:13.267737Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock"
level=error timestamp=2018-08-02T09:56:13.267807Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:56:13.267899Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi47txh"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:56:15.005364Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi66xsd"
level=info timestamp=2018-08-02T09:56:15.045739Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi66xsd, existing: true\n"
level=info timestamp=2018-08-02T09:56:15.045775Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:56:15.045811Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:56:15.045868Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:56:15.045986Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock"
level=error timestamp=2018-08-02T09:56:15.046028Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:56:15.046090Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi66xsd"
level=info timestamp=2018-08-02T09:56:15.128918Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi66xsd, existing: true\n"
level=info timestamp=2018-08-02T09:56:15.128995Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:56:15.129018Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:56:15.129097Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:56:15.129266Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock"
level=error timestamp=2018-08-02T09:56:15.129314Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:56:15.129388Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi66xsd"

Pod name: virt-launcher-testvmi47txh-bgxhs
Pod phase: Running
level=info timestamp=2018-08-02T09:56:00.234380Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:00.235009Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:00.236995Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:10.243858Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:10.286058Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi47txh"
level=info timestamp=2018-08-02T09:56:10.287823Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:10.288019Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi66xsd-xnkt4
Pod phase: Running
level=info timestamp=2018-08-02T09:56:03.084245Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:03.084549Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:03.085861Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:13.947457Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:14.022914Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi66xsd"
level=info timestamp=2018-08-02T09:56:14.024408Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:14.024570Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmibnrcn-hgw8w
Pod phase: Running
level=info timestamp=2018-08-02T09:56:03.537093Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:03.537587Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:03.543566Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"

Pod name: virt-launcher-testvmin2nc6-vvlsr
Pod phase: Running
level=info timestamp=2018-08-02T09:56:01.668346Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:01.698212Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:01.700409Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T09:58:09.539598Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:58:10.488160Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:58:13 http: TLS handshake error from 10.244.1.1:36076: EOF
level=info timestamp=2018-08-02T09:58:14.545979Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:58:14.548038Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:58:15.612878Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:58:23 http: TLS handshake error from 10.244.1.1:36082: EOF
2018/08/02 09:58:33 http: TLS handshake error from 10.244.1.1:36088: EOF
level=info timestamp=2018-08-02T09:58:34.064866Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:58:34.071946Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:58:39.455521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:58:40.510993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:58:43 http: TLS handshake error from 10.244.1.1:36094: EOF
level=info timestamp=2018-08-02T09:58:45.666493Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:58:53 http: TLS handshake error from 10.244.1.1:36100: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:52:55.326578Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.326712Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpqjs kind= uid=d41f49d9-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.358696Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:52:55.358772Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqnxmn kind= uid=d421a823-9639-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:52:55.375885Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:52:55.497160Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6lvj2\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6lvj2"
level=info timestamp=2018-08-02T09:52:55.910379Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid7fqp\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid7fqp"
level=info timestamp=2018-08-02T09:55:56.843601Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.844870Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi66xsd kind= uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.851461Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.851634Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.862550Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.862637Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.872429Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.872498Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T09:57:32.633448Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:57:32.633727Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock"
level=error timestamp=2018-08-02T09:57:32.633795Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:57:32.633890Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi47txh"
level=info timestamp=2018-08-02T09:58:38.668852Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidpqjs, existing: false\n"
level=info timestamp=2018-08-02T09:58:38.668987Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:38.669089Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:38.669223Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmidpqjs kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:54.556148Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi47txh, existing: true\n"
level=info timestamp=2018-08-02T09:58:54.556231Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:58:54.556256Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:54.556334Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:58:54.556631Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock"
level=error timestamp=2018-08-02T09:58:54.556701Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:58:54.556777Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi47txh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi47txh"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T09:57:36.904194Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi66xsd"
level=info timestamp=2018-08-02T09:57:37.723320Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmin2nc6, existing: true\n"
level=info timestamp=2018-08-02T09:57:37.723412Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:57:37.723437Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:37.723737Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:57:37.724046Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmin2nc6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmin2nc6/sock"
level=error timestamp=2018-08-02T09:57:37.724133Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmin2nc6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:57:37.724207Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmin2nc6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmin2nc6"
level=info timestamp=2018-08-02T09:57:38.113537Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmibnrcn, existing: true\n"
level=info timestamp=2018-08-02T09:57:38.113612Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:57:38.113635Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:38.113728Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:57:38.113966Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnrcn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmibnrcn/sock"
level=error timestamp=2018-08-02T09:57:38.114027Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnrcn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:57:38.114103Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmibnrcn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmibnrcn"

Pod name: virt-launcher-testvmi47txh-bgxhs
Pod phase: Running
level=info timestamp=2018-08-02T09:56:00.234380Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:00.235009Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:00.236995Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:10.243858Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:10.286058Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi47txh"
level=info timestamp=2018-08-02T09:56:10.287823Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:10.288019Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi66xsd-xnkt4
Pod phase: Running
level=info timestamp=2018-08-02T09:56:03.084245Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:03.084549Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:03.085861Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:13.947457Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:14.022914Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi66xsd"
level=info timestamp=2018-08-02T09:56:14.024408Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:14.024570Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmibnrcn-hgw8w
Pod phase: Running
level=info timestamp=2018-08-02T09:56:03.537093Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:03.537587Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:03.543566Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:15.946872Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:15.986330Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmibnrcn"
level=info timestamp=2018-08-02T09:56:15.988271Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:15.988436Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmin2nc6-vvlsr
Pod phase: Running
level=info timestamp=2018-08-02T09:56:01.668346Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:56:01.698212Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:56:01.700409Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:56:15.675784Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:56:15.735247Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmin2nc6"
level=info timestamp=2018-08-02T09:56:15.740378Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:56:15.740535Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.483 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on the same node from Node
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmi66xsd,4050b62c-963a-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory
    Expected
      : Warning
    not to equal
      : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
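Both failures are reported from tests/utils.go:247, where the test streams the VMI's Kubernetes events while waiting for the launcher and asserts that no Warning-type event arrives. The sketch below is a rough illustration of that pattern only — the function name and channel plumbing are assumptions, not the actual utils.go code — using the k8s.io/api/core/v1 types:

package tests

import (
	"fmt"

	k8sv1 "k8s.io/api/core/v1"
)

// failOnWarning drains events observed for a VMI and aborts on the first
// Warning, mirroring the "Expected ... not to equal ... Warning" assertion
// in the log. Illustrative sketch only; not the KubeVirt test helper itself.
func failOnWarning(events <-chan *k8sv1.Event) error {
	for ev := range events {
		if ev.Type == k8sv1.EventTypeWarning {
			return fmt.Errorf("unexpected warning event received: %s", ev.Message)
		}
	}
	return nil
}

Here the Warning is the socket dial failure that virt-handler attaches to the VMI as an event, so the BeforeEach aborts long before the reachability check itself ever runs.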
level=info timestamp=2018-08-02T09:55:57.495352Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi66xsd kind=VirtualMachineInstance uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi66xsd-xnkt4" level=info timestamp=2018-08-02T09:56:15.450395Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi66xsd kind=VirtualMachineInstance uid=4050b62c-963a-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi66xsd-xnkt4" level=error timestamp=2018-08-02T09:56:15.465616Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi66xsd kind=VirtualMachineInstance uid=4050b62c-963a-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi66xsd/sock: connect: no such file or directory" Pod name: disks-images-provider-bbjxx Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-ssdb6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-9j92f Pod phase: Running level=info timestamp=2018-08-02T09:58:34.071946Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:58:39.455521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:58:40.510993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:58:43 http: TLS handshake error from 10.244.1.1:36094: EOF level=info timestamp=2018-08-02T09:58:45.666493Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:58:53 http: TLS handshake error from 10.244.1.1:36100: EOF 2018/08/02 09:59:03 http: TLS handshake error from 10.244.1.1:36106: EOF level=info timestamp=2018-08-02T09:59:04.118692Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:59:04.156396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:59:09.457565Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:59:10.532991Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:59:13 http: TLS handshake error from 10.244.1.1:36112: EOF level=info timestamp=2018-08-02T09:59:14.552124Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:59:14.560911Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 
username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:59:15.721297Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-controller-67dcdd8464-w2xzj Pod phase: Running level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-67dcdd8464-wvph9 Pod phase: Running level=info timestamp=2018-08-02T09:55:56.851634Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:55:56.862550Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:55:56.862637Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:55:56.872429Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:55:56.872498Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:58:58.327875Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:58:58.329012Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:58:58.338940Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigk6vr kind= uid=ac7ee63b-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:58:58.339023Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigk6vr kind= uid=ac7ee63b-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:58:58.352519Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:58:58.352565Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:58:58.369278Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= 
uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:58:58.369366Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:58:58.404956Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizzrlj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj" level=info timestamp=2018-08-02T09:58:58.532726Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmintdwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb" Pod name: virt-handler-92p79 Pod phase: Running level=info timestamp=2018-08-02T09:59:14.296452Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb" level=info timestamp=2018-08-02T09:59:15.576726Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmintdwb, existing: true\n" level=info timestamp=2018-08-02T09:59:15.576805Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:59:15.576827Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:15.577018Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:59:15.577235Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock" level=error timestamp=2018-08-02T09:59:15.577305Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:59:15.577383Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb" level=info timestamp=2018-08-02T09:59:18.137571Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmintdwb, existing: true\n" level=info timestamp=2018-08-02T09:59:18.137667Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:59:18.137697Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:18.137779Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:59:18.138142Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock" level=error timestamp=2018-08-02T09:59:18.138224Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:59:18.138318Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb" Pod name: virt-handler-sn5j5 Pod phase: Running level=info timestamp=2018-08-02T09:59:18.559599Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj" level=info timestamp=2018-08-02T09:59:18.579765Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizzrlj, existing: true\n" level=info timestamp=2018-08-02T09:59:18.579819Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:59:18.579858Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:18.579922Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:59:18.580020Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock" level=error timestamp=2018-08-02T09:59:18.580061Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance 
failed." level=info timestamp=2018-08-02T09:59:18.580128Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj" level=info timestamp=2018-08-02T09:59:18.620328Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizzrlj, existing: true\n" level=info timestamp=2018-08-02T09:59:18.620427Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:59:18.620453Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:18.620558Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:59:18.620729Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock" level=error timestamp=2018-08-02T09:59:18.620789Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:59:18.620865Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj" Pod name: virt-launcher-testvmidz6jv-5lbzj Pod phase: Running level=info timestamp=2018-08-02T09:59:04.880044Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:59:04.880325Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:59:04.887725Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:59:16.869339Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:59:16.914939Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidz6jv" level=info timestamp=2018-08-02T09:59:16.917078Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:59:16.917246Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmigk6vr-w9ffm Pod phase: Running level=info timestamp=2018-08-02T09:59:02.156490Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:59:02.157279Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:59:02.158965Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" 
level=info timestamp=2018-08-02T09:59:14.630333Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:14.751850Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmigk6vr"
level=info timestamp=2018-08-02T09:59:14.754157Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:14.754336Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmintdwb-6wkjz
Pod phase: Running
level=info timestamp=2018-08-02T09:59:01.855965Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:01.856233Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:01.857446Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:11.864674Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:11.911401Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmintdwb"
level=info timestamp=2018-08-02T09:59:11.914493Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:11.914689Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmizzrlj-hpxmm
Pod phase: Running
level=info timestamp=2018-08-02T09:59:04.768708Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:04.768948Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:04.770374Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:16.679500Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:16.773984Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmizzrlj"
level=info timestamp=2018-08-02T09:59:16.775920Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:16.776142Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:01:09.461720Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:01:10.628076Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:13 http: TLS handshake error from 10.244.1.1:36184: EOF
level=info timestamp=2018-08-02T10:01:14.562617Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:01:14.563699Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:01:15.949416Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:23 http: TLS handshake error from 10.244.1.1:36190: EOF
2018/08/02 10:01:33 http: TLS handshake error from 10.244.1.1:36196: EOF
level=info timestamp=2018-08-02T10:01:34.383221Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:01:34.394960Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:01:39.475075Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:01:40.651056Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:43 http: TLS handshake error from 10.244.1.1:36202: EOF
level=info timestamp=2018-08-02T10:01:46.007070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:53 http: TLS handshake error from 10.244.1.1:36208: EOF
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:55:56.851634Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi47txh kind= uid=405266ea-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.862550Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.862637Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmin2nc6 kind= uid=40538dbe-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:55:56.872429Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:55:56.872498Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibnrcn kind= uid=405538a1-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.327875Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.329012Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.338940Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigk6vr kind= uid=ac7ee63b-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.339023Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigk6vr kind= uid=ac7ee63b-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.352519Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.352565Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.369278Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.369366Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.404956Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizzrlj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj"
level=info timestamp=2018-08-02T09:58:58.532726Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmintdwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:00:34.942482Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:00:34.942770Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock"
level=error timestamp=2018-08-02T10:00:34.942837Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:00:34.942929Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb"
level=info timestamp=2018-08-02T10:01:38.397106Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi47txh, existing: false\n"
level=info timestamp=2018-08-02T10:01:38.397184Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:01:38.397264Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:01:38.397341Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi47txh kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:01:56.866292Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmintdwb, existing: true\n"
level=info timestamp=2018-08-02T10:01:56.866373Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:01:56.866397Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:01:56.866478Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:01:56.866650Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock"
level=error timestamp=2018-08-02T10:01:56.866715Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind= uid=ac80c705-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:01:56.866781Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmintdwb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:00:37.165090Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigk6vr/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigk6vr"
level=info timestamp=2018-08-02T10:00:39.760821Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmidz6jv, existing: true\n"
level=info timestamp=2018-08-02T10:00:39.760903Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:00:39.760927Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:00:39.761003Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:00:39.761242Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidz6jv/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmidz6jv/sock"
level=error timestamp=2018-08-02T10:00:39.761302Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidz6jv/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:00:39.761705Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmidz6jv/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmidz6jv"
level=info timestamp=2018-08-02T10:00:40.469830Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmizzrlj, existing: true\n"
level=info timestamp=2018-08-02T10:00:40.469924Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:00:40.469955Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:00:40.470046Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:00:40.470392Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock"
level=error timestamp=2018-08-02T10:00:40.470467Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmizzrlj kind= uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:00:40.470550Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj"
Pod name: virt-launcher-testvmidz6jv-5lbzj
Pod phase: Running
level=info timestamp=2018-08-02T09:59:04.880044Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:04.880325Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:04.887725Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:16.869339Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:16.914939Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmidz6jv"
level=info timestamp=2018-08-02T09:59:16.917078Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:16.917246Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmigk6vr-w9ffm
Pod phase: Running
level=info timestamp=2018-08-02T09:59:02.156490Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:02.157279Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:02.158965Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:14.630333Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:14.751850Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmigk6vr"
level=info timestamp=2018-08-02T09:59:14.754157Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:14.754336Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmintdwb-6wkjz
Pod phase: Running
level=info timestamp=2018-08-02T09:59:01.855965Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:01.856233Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:01.857446Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:11.864674Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:11.911401Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmintdwb"
level=info timestamp=2018-08-02T09:59:11.914493Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:11.914689Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmizzrlj-hpxmm
Pod phase: Running
level=info timestamp=2018-08-02T09:59:04.768708Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:59:04.768948Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:59:04.770374Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:59:16.679500Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:59:16.773984Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmizzrlj"
level=info timestamp=2018-08-02T09:59:16.775920Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:59:16.776142Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [181.526 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  should be reachable via the propagated IP from a Pod [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    on a different node from Node
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

    Unexpected Warning event received: testvmizzrlj,ac7d661e-963a-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
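This failure (and the one after it) comes from the assertion at tests/utils.go:247: while the VMI starts, the test watches the Kubernetes events recorded for it and fails on the first event whose type is Warning, which is why the message reads "Expected Warning not to equal Warning". Below is a minimal sketch of that kind of check written against plain client-go rather than KubeVirt's actual test helpers; the function name, the kubeconfig wiring, and the one-shot List in place of the tests' event watch are all assumptions (recent client-go also requires a context argument that the 2018 vendored API did not have).

// checkNoWarningEvents is a hypothetical helper, not KubeVirt's utils.go code:
// it lists the events whose involvedObject is the given VMI and fails on the
// first event of type Warning, mirroring the assertion seen in the log above.
package main

import (
	"context"
	"fmt"
	"os"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func checkNoWarningEvents(ctx context.Context, client kubernetes.Interface, namespace, name string) error {
	// Events carry the object they describe in involvedObject; filter on its name.
	events, err := client.CoreV1().Events(namespace).List(ctx, metav1.ListOptions{
		FieldSelector: fmt.Sprintf("involvedObject.name=%s", name),
	})
	if err != nil {
		return err
	}
	for _, ev := range events.Items {
		if ev.Type == corev1.EventTypeWarning {
			// The condition the test asserts must never hold
			// ("Expected Warning not to equal Warning").
			return fmt.Errorf("unexpected warning event received: %s", ev.Message)
		}
	}
	return nil
}

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	// Namespace and VMI name taken from the failure above.
	if err := checkNoWarningEvents(context.Background(), client, "kubevirt-test-default", "testvmizzrlj"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}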
level=info timestamp=2018-08-02T09:58:58.928626Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizzrlj kind=VirtualMachineInstance uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmizzrlj-hpxmm"
level=info timestamp=2018-08-02T09:59:19.033053Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmizzrlj kind=VirtualMachineInstance uid=ac7d661e-963a-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmizzrlj-hpxmm"
level=error timestamp=2018-08-02T09:59:19.043645Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmizzrlj kind=VirtualMachineInstance uid=ac7d661e-963a-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmizzrlj/sock: connect: no such file or directory"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
2018/08/02 10:01:33 http: TLS handshake error from 10.244.1.1:36196: EOF
level=info timestamp=2018-08-02T10:01:34.383221Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:01:34.394960Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:01:39.475075Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:01:40.651056Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:43 http: TLS handshake error from 10.244.1.1:36202: EOF
level=info timestamp=2018-08-02T10:01:46.007070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:01:53 http: TLS handshake error from 10.244.1.1:36208: EOF
2018/08/02 10:02:03 http: TLS handshake error from 10.244.1.1:36214: EOF
level=info timestamp=2018-08-02T10:02:04.436497Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:02:04.457863Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:02:09.476644Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:02:10.674500Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:02:13 http: TLS handshake error from 10.244.1.1:36220: EOF
level=info timestamp=2018-08-02T10:02:16.062405Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:58:58.369278Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.369366Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.404956Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizzrlj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj"
level=info timestamp=2018-08-02T09:58:58.532726Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmintdwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb"
level=info timestamp=2018-08-02T10:01:59.859523Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.860870Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.868436Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.868568Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.880676Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.880784Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.889746Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.889798Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.960562Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:01:59.961031Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiht5hn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiht5hn"
level=info timestamp=2018-08-02T10:02:00.719021Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:02:15.515655Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicrcrg"
level=info timestamp=2018-08-02T10:02:16.155939Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicrcrg, existing: true\n"
level=info timestamp=2018-08-02T10:02:16.156025Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:02:16.156055Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:02:16.156155Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:02:16.156499Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock"
level=error timestamp=2018-08-02T10:02:16.156631Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:02:16.156759Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicrcrg"
level=info timestamp=2018-08-02T10:02:17.437018Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicrcrg, existing: true\n"
level=info timestamp=2018-08-02T10:02:17.437117Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:02:17.437159Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:02:17.437282Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:02:17.437478Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock"
level=error timestamp=2018-08-02T10:02:17.437584Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:02:17.437688Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicrcrg"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:02:18.686202Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:02:18.710098Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipfhc7, existing: true\n"
level=info timestamp=2018-08-02T10:02:18.710157Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:02:18.710180Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:02:18.710237Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:02:18.710489Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock"
level=error timestamp=2018-08-02T10:02:18.710531Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:02:18.710602Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:02:18.753417Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipfhc7, existing: true\n"
level=info timestamp=2018-08-02T10:02:18.753500Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:02:18.753525Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:02:18.753628Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:02:18.753998Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock"
level=error timestamp=2018-08-02T10:02:18.754061Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:02:18.754129Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
Pod name: virt-launcher-testvmi7m4pb-584jb
Pod phase: Running
level=info timestamp=2018-08-02T10:02:06.275436Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:06.276234Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:06.283150Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:18.636881Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:18.675270Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7m4pb"
level=info timestamp=2018-08-02T10:02:18.679091Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:18.679252Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmicrcrg-7w2lv
Pod phase: Running
level=info timestamp=2018-08-02T10:02:03.071550Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:03.071848Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:03.073555Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:13.080619Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:13.107463Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicrcrg"
level=info timestamp=2018-08-02T10:02:13.109328Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:13.109502Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiht5hn-2kb6l
Pod phase: Running
level=info timestamp=2018-08-02T10:02:06.280234Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:06.280634Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:06.286248Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:18.764626Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:18.812967Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiht5hn"
level=info timestamp=2018-08-02T10:02:18.814916Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:18.815067Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmipfhc7-rzsgv
Pod phase: Running
level=info timestamp=2018-08-02T10:02:05.292749Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:05.299025Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:05.300433Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:17.581970Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:17.670903Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipfhc7"
level=info timestamp=2018-08-02T10:02:17.672408Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:17.672550Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:04:09.481015Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:04:10.817158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:04:13 http: TLS handshake error from 10.244.1.1:36292: EOF
level=info timestamp=2018-08-02T10:04:14.628693Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:04:14.630513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:04:16.370272Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:04:23 http: TLS handshake error from 10.244.1.1:36298: EOF
2018/08/02 10:04:33 http: TLS handshake error from 10.244.1.1:36304: EOF
level=info timestamp=2018-08-02T10:04:34.701300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:04:34.725758Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:04:39.454267Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:04:40.837396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:04:43 http: TLS handshake error from 10.244.1.1:36310: EOF
level=info timestamp=2018-08-02T10:04:46.423784Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:04:53 http: TLS handshake error from 10.244.1.1:36316: EOF
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T09:58:58.369278Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:58:58.369366Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidz6jv kind= uid=ac835606-963a-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:58:58.404956Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizzrlj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizzrlj"
level=info timestamp=2018-08-02T09:58:58.532726Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmintdwb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmintdwb"
level=info timestamp=2018-08-02T10:01:59.859523Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.860870Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipfhc7 kind= uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.868436Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.868568Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.880676Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.880784Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.889746Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.889798Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.960562Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:01:59.961031Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiht5hn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiht5hn"
level=info timestamp=2018-08-02T10:02:00.719021Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:03:36.802168Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:03:36.802477Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock"
level=error timestamp=2018-08-02T10:03:36.802541Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:03:36.802604Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicrcrg"
level=info timestamp=2018-08-02T10:04:40.707807Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmintdwb, existing: false\n"
level=info timestamp=2018-08-02T10:04:40.708091Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:04:40.708268Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:04:40.708345Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmintdwb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:04:58.727779Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicrcrg, existing: true\n"
level=info timestamp=2018-08-02T10:04:58.727864Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:04:58.727888Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:04:58.728032Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:04:58.728316Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock"
level=error timestamp=2018-08-02T10:04:58.728512Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind= uid=18b297e0-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:04:58.728589Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmicrcrg/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmicrcrg"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:03:41.721987Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:03:41.722263Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7m4pb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7m4pb/sock"
level=error timestamp=2018-08-02T10:03:41.722321Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7m4pb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:03:41.722396Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7m4pb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7m4pb"
level=info timestamp=2018-08-02T10:03:41.844986Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiht5hn, existing: true\n"
level=info timestamp=2018-08-02T10:03:41.845083Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:03:41.845109Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:03:41.845185Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:03:41.845416Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiht5hn/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiht5hn/sock"
level=error timestamp=2018-08-02T10:03:41.845501Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiht5hn/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:03:41.845587Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiht5hn/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiht5hn"
level=info timestamp=2018-08-02T10:04:42.926292Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmigk6vr, existing: false\n"
level=info timestamp=2018-08-02T10:04:42.926368Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:04:42.926435Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmigk6vr kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:04:42.926526Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmigk6vr kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi7m4pb-584jb
Pod phase: Running
level=info timestamp=2018-08-02T10:02:06.275436Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:06.276234Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:06.283150Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:18.636881Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:18.675270Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7m4pb"
level=info timestamp=2018-08-02T10:02:18.679091Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:18.679252Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmicrcrg-7w2lv
Pod phase: Running
level=info timestamp=2018-08-02T10:02:03.071550Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:03.071848Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:03.073555Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:13.080619Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:13.107463Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicrcrg"
level=info timestamp=2018-08-02T10:02:13.109328Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:13.109502Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiht5hn-2kb6l
Pod phase: Running
level=info timestamp=2018-08-02T10:02:06.280234Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:06.280634Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:06.286248Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:18.764626Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:18.812967Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiht5hn"
level=info timestamp=2018-08-02T10:02:18.814916Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:18.815067Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmipfhc7-rzsgv
Pod phase: Running
level=info timestamp=2018-08-02T10:02:05.292749Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:02:05.299025Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:02:05.300433Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:02:17.581970Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:02:17.670903Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipfhc7"
level=info timestamp=2018-08-02T10:02:17.672408Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:02:17.672550Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [181.495 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a service matching the vmi exposed [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:272
    should be able to reach the vmi based on labels specified on the vmi
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:292

    Unexpected Warning event received: testvmipfhc7,18b09079-963b-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory
    Expected
        : Warning
    not to equal
        : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T10:02:00.569950Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmipfhc7-rzsgv"
level=info timestamp=2018-08-02T10:02:19.163530Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid=18b09079-963b-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmipfhc7-rzsgv"
level=error timestamp=2018-08-02T10:02:19.164255Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid=18b09079-963b-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock: connect: no such file or directory"
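Both failing specs trip over the same warning: virt-handler cannot yet reach the per-VMI unix socket that virt-launcher creates, so isolation.go:82 fails to resolve the socket's owner PID and the VMI is re-enqueued until the socket appears. The following is a rough sketch of that mechanism under stated assumptions; it is not KubeVirt's actual isolation code, and the helper name and the use of SO_PEERCRED are inferred from the log messages, not confirmed by the source.

// socketOwnerPid is a hypothetical sketch of the check behind "Could not get
// owner Pid of socket": dial the per-VMI unix socket and ask the kernel for
// the peer process's credentials. While the socket file does not exist yet,
// net.Dial returns "connect: no such file or directory", which matches the
// reason string virt-handler logs before re-enqueuing the VMI.
package main

import (
	"fmt"
	"net"

	"golang.org/x/sys/unix"
)

func socketOwnerPid(path string) (int, error) {
	conn, err := net.Dial("unix", path)
	if err != nil {
		return -1, err // e.g. the launcher has not created the socket yet
	}
	defer conn.Close()

	raw, err := conn.(*net.UnixConn).SyscallConn()
	if err != nil {
		return -1, err
	}
	pid := -1
	var credErr error
	if err := raw.Control(func(fd uintptr) {
		// SO_PEERCRED yields the pid/uid/gid of the process on the other
		// end of the socket (Linux only).
		cred, e := unix.GetsockoptUcred(int(fd), unix.SOL_SOCKET, unix.SO_PEERCRED)
		if e != nil {
			credErr = e
			return
		}
		pid = int(cred.Pid)
	}); err != nil {
		return -1, err
	}
	if credErr != nil {
		return -1, credErr
	}
	return pid, nil
}

func main() {
	// Socket path taken from the failure above.
	pid, err := socketOwnerPid("/var/run/kubevirt/kubevirt-test-default/testvmipfhc7/sock")
	if err != nil {
		fmt.Println("could not get owner Pid of socket:", err)
		return
	}
	fmt.Println("socket owned by pid", pid)
}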
all images to host mount directory Pod name: disks-images-provider-ssdb6 Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79975b94-9j92f Pod phase: Running 2018/08/02 10:04:23 http: TLS handshake error from 10.244.1.1:36298: EOF 2018/08/02 10:04:33 http: TLS handshake error from 10.244.1.1:36304: EOF level=info timestamp=2018-08-02T10:04:34.701300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:04:34.725758Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:04:39.454267Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:04:40.837396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:04:43 http: TLS handshake error from 10.244.1.1:36310: EOF level=info timestamp=2018-08-02T10:04:46.423784Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:04:53 http: TLS handshake error from 10.244.1.1:36316: EOF 2018/08/02 10:05:03 http: TLS handshake error from 10.244.1.1:36322: EOF level=info timestamp=2018-08-02T10:05:04.803354Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:05:04.803620Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T10:05:09.499382Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T10:05:10.865907Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:05:13 http: TLS handshake error from 10.244.1.1:36328: EOF Pod name: virt-controller-67dcdd8464-w2xzj Pod phase: Running level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-67dcdd8464-wvph9 Pod phase: Running level=info timestamp=2018-08-02T10:01:59.880676Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:01:59.880784Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:01:59.889746Z pos=preset.go:142 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:01:59.889798Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:01:59.960562Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7" level=info timestamp=2018-08-02T10:01:59.961031Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiht5hn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiht5hn" level=info timestamp=2018-08-02T10:02:00.719021Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7" level=info timestamp=2018-08-02T10:05:01.353398Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:05:01.354598Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:05:01.361151Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:05:01.361326Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:05:01.373300Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:05:01.373398Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T10:05:01.383572Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:05:01.383616Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-92p79 Pod phase: Running level=info 
timestamp=2018-08-02T10:05:15.243410Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikgcr7" level=info timestamp=2018-08-02T10:05:15.263593Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikgcr7, existing: true\n" level=info timestamp=2018-08-02T10:05:15.263738Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:05:15.263765Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:15.263844Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:05:15.264109Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock" level=error timestamp=2018-08-02T10:05:15.264192Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:05:15.268958Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikgcr7" level=info timestamp=2018-08-02T10:05:15.309296Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikgcr7, existing: true\n" level=info timestamp=2018-08-02T10:05:15.309383Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:05:15.309411Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:15.309505Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:05:15.309688Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock" level=error timestamp=2018-08-02T10:05:15.309770Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:05:15.309843Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikgcr7" Pod name: virt-handler-sn5j5 Pod phase: Running level=info timestamp=2018-08-02T10:05:00.673362Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:00.673411Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:05:00.673476Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:05:02.529038Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipfhc7, existing: false\n" level=info timestamp=2018-08-02T10:05:02.534531Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:02.534758Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:05:02.534850Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipfhc7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:05:03.645315Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7m4pb, existing: false\n" level=info timestamp=2018-08-02T10:05:03.645398Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:03.645457Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi7m4pb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:05:03.645522Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7m4pb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:05:03.765930Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiht5hn, existing: false\n" level=info timestamp=2018-08-02T10:05:03.766010Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:05:03.766074Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiht5hn kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T10:05:03.766141Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiht5hn kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
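The repeating virt-handler errors above ("Could not get owner Pid of socket ...", pos=isolation.go:82) all come from dialing a per-VMI unix socket that virt-launcher has not yet created, so the dial fails with "connect: no such file or directory" and the handler re-enqueues the VMI and retries. As a rough, hypothetical sketch only (not KubeVirt's actual isolation.go), resolving the owner PID of such a socket in Go can be done with SO_PEERCRED once the dial succeeds; the helper name ownerPidOfSocket below is an assumption for illustration:

package main

import (
	"fmt"
	"net"
	"os"

	"golang.org/x/sys/unix"
)

// ownerPidOfSocket dials the per-VMI socket and asks the kernel for the
// peer's credentials (SO_PEERCRED). While the launcher has not yet
// created the socket file, net.Dial fails with
// "connect: no such file or directory", matching the log lines above.
func ownerPidOfSocket(path string) (int, error) {
	conn, err := net.Dial("unix", path)
	if err != nil {
		return 0, fmt.Errorf("could not get owner Pid of socket %s: %v", path, err)
	}
	defer conn.Close()

	raw, err := conn.(*net.UnixConn).SyscallConn()
	if err != nil {
		return 0, err
	}
	var cred *unix.Ucred
	var credErr error
	if ctlErr := raw.Control(func(fd uintptr) {
		cred, credErr = unix.GetsockoptUcred(int(fd), unix.SOL_SOCKET, unix.SO_PEERCRED)
	}); ctlErr != nil {
		return 0, ctlErr
	}
	if credErr != nil {
		return 0, credErr
	}
	return int(cred.Pid), nil
}

func main() {
	pid, err := ownerPidOfSocket("/var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock")
	if err != nil {
		fmt.Fprintln(os.Stderr, err) // a controller would re-enqueue and retry instead of exiting
		os.Exit(1)
	}
	fmt.Println("virt-launcher pid:", pid)
}

If the socket never appears within the test's watch window, the handler keeps looping like this, which is consistent with the ~181-second BeforeEach failures recorded in this run.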
Pod name: virt-launcher-testvmi4wpnf-5gftq
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.346810Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.347093Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.350301Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
Pod name: virt-launcher-testvmi7m4pb-584jb
Pod phase: Failed
Pod name: virt-launcher-testvmi88z74-9kpwb
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.026506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.026814Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.038532Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
Pod name: virt-launcher-testvmikgcr7-hx49p
Pod phase: Running
level=info timestamp=2018-08-02T10:05:04.814896Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:04.815516Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:04.817996Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:05:14.824270Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:05:14.896292Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikgcr7"
level=info timestamp=2018-08-02T10:05:14.898615Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:05:14.898765Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmix7bb8-vd7hq
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.925325Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.926299Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.929046Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:07:04.983864Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:07:04.993705Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:07:09.456578Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:07:10.958409Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:13 http: TLS handshake error from 10.244.1.1:36400: EOF
level=info timestamp=2018-08-02T10:07:16.693443Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:23 http: TLS handshake error from 10.244.1.1:36406: EOF
2018/08/02 10:07:33 http: TLS handshake error from 10.244.1.1:36412: EOF
level=info timestamp=2018-08-02T10:07:35.035358Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:07:35.046993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:07:39.483144Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:07:40.979226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:43 http: TLS handshake error from 10.244.1.1:36418: EOF
level=info timestamp=2018-08-02T10:07:46.746885Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:53 http: TLS handshake error from 10.244.1.1:36424: EOF
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:01:59.880676Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.880784Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiht5hn kind= uid=18b3ce05-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.889746Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:01:59.889798Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7m4pb kind= uid=18b573ec-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:01:59.960562Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:01:59.961031Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiht5hn\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiht5hn"
level=info timestamp=2018-08-02T10:02:00.719021Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipfhc7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipfhc7"
level=info timestamp=2018-08-02T10:05:01.353398Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.354598Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.361151Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.361326Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.373300Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.373398Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.383572Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.383616Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:06:37.159865Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:37.160194Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock"
level=error timestamp=2018-08-02T10:06:37.160264Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:37.160347Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikgcr7"
level=info timestamp=2018-08-02T10:07:42.568947Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmicrcrg, existing: false\n"
level=info timestamp=2018-08-02T10:07:42.569033Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:07:42.569106Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:07:42.569212Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmicrcrg kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:07:59.081253Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikgcr7, existing: true\n"
level=info timestamp=2018-08-02T10:07:59.081326Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:07:59.081349Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:07:59.081423Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:07:59.081663Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock"
level=error timestamp=2018-08-02T10:07:59.081732Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:07:59.081799Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikgcr7"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:06:42.312413Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88z74/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi88z74"
level=info timestamp=2018-08-02T10:06:42.784394Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi4wpnf, existing: true\n"
level=info timestamp=2018-08-02T10:06:42.784491Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:06:42.784522Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:06:42.784611Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:42.784918Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4wpnf/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi4wpnf/sock"
level=error timestamp=2018-08-02T10:06:42.784997Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4wpnf/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:42.785078Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi4wpnf/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi4wpnf"
level=info timestamp=2018-08-02T10:06:44.305572Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmix7bb8, existing: true\n"
level=info timestamp=2018-08-02T10:06:44.305672Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:06:44.305701Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:06:44.305775Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:44.306055Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmix7bb8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmix7bb8/sock"
level=error timestamp=2018-08-02T10:06:44.306120Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmix7bb8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:44.306190Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmix7bb8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmix7bb8"
Pod name: virt-launcher-testvmi4wpnf-5gftq
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.346810Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.347093Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.350301Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:05:19.589363Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:05:19.645430Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi4wpnf"
level=info timestamp=2018-08-02T10:05:19.647993Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:05:19.648174Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmi88z74-9kpwb
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.026506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.026814Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.038532Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:05:19.832797Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:05:19.858786Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi88z74"
level=info timestamp=2018-08-02T10:05:19.860533Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:05:19.860724Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmikgcr7-hx49p
Pod phase: Running
level=info timestamp=2018-08-02T10:05:04.814896Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:04.815516Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:04.817996Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:05:14.824270Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:05:14.896292Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikgcr7"
level=info timestamp=2018-08-02T10:05:14.898615Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:05:14.898765Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmix7bb8-vd7hq
Pod phase: Running
level=info timestamp=2018-08-02T10:05:07.925325Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:05:07.926299Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:05:07.929046Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:05:20.516075Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:05:20.533014Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmix7bb8"
level=info timestamp=2018-08-02T10:05:20.534625Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:05:20.534776Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.507 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a service matching the vmi exposed [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:272
    should fail to reach the vmi if an invalid servicename is used
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:303

    Unexpected Warning event received: testvmikgcr7,84de5d89-963b-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory
    Expected
      : Warning
    not to equal
      : Warning
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T10:05:02.145928Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmikgcr7 kind=VirtualMachineInstance uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmikgcr7-hx49p"
level=info timestamp=2018-08-02T10:05:15.935372Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmikgcr7 kind=VirtualMachineInstance uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmikgcr7-hx49p"
level=error timestamp=2018-08-02T10:05:15.946932Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmikgcr7 kind=VirtualMachineInstance uid=84de5d89-963b-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikgcr7/sock: connect: no such file or directory"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:07:40.979226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:43 http: TLS handshake error from 10.244.1.1:36418: EOF
level=info timestamp=2018-08-02T10:07:46.746885Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:07:53 http: TLS handshake error from 10.244.1.1:36424: EOF
2018/08/02 10:08:03 http: TLS handshake error from 10.244.1.1:36430: EOF
level=info timestamp=2018-08-02T10:08:05.089258Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:08:05.102726Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:08:08.664027Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:08.669544Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:09.950399Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:11.010056Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:13 http: TLS handshake error from 10.244.1.1:36436: EOF
level=info timestamp=2018-08-02T10:08:15.234471Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:15.236840Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:16.800527Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:05:01.354598Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.361151Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.361326Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.373300Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.373398Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.383572Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.383616Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.862363Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.863583Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.872013Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.872117Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.888056Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.888120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.908027Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.908080Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:08:19.475000Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmird6c7"
level=info timestamp=2018-08-02T10:08:20.115233Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmird6c7, existing: true\n"
level=info timestamp=2018-08-02T10:08:20.115322Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:08:20.115354Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:08:20.115446Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:08:20.115782Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock"
level=error timestamp=2018-08-02T10:08:20.115873Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:08:20.116010Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmird6c7"
level=info timestamp=2018-08-02T10:08:21.396291Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmird6c7, existing: true\n"
level=info timestamp=2018-08-02T10:08:21.396374Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:08:21.396403Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:08:21.396495Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:08:21.396790Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock"
level=error timestamp=2018-08-02T10:08:21.396888Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:08:21.397174Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmird6c7"
Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:08:23.626774Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6n8l"
level=info timestamp=2018-08-02T10:08:23.646919Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiv6n8l, existing: true\n"
level=info timestamp=2018-08-02T10:08:23.646951Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:08:23.646972Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:08:23.647032Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:08:23.647133Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock"
level=error timestamp=2018-08-02T10:08:23.647186Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:08:23.647253Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6n8l"
level=info timestamp=2018-08-02T10:08:23.687468Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiv6n8l, existing: true\n"
level=info timestamp=2018-08-02T10:08:23.687533Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:08:23.687558Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:08:23.687677Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:08:23.687863Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock"
level=error timestamp=2018-08-02T10:08:23.687926Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:08:23.688000Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6n8l"
Pod name: virt-launcher-testvmi9rxdh-vqksf
Pod phase: Running
level=info timestamp=2018-08-02T10:08:09.490507Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:08:09.493305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:08:09.495057Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:08:21.487321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:08:21.555320Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9rxdh"
level=info timestamp=2018-08-02T10:08:21.560181Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:08:21.560336Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmird6c7-nmmrl
Pod phase: Running
level=info timestamp=2018-08-02T10:08:06.787632Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:08:06.788458Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:08:06.790401Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:08:16.795769Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:08:16.833821Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmird6c7"
level=info timestamp=2018-08-02T10:08:16.835410Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:08:16.835553Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiv6n8l-sgbp6
Pod phase: Running
level=info timestamp=2018-08-02T10:08:09.132005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:08:09.133117Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:08:09.134744Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:08:21.862814Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:08:21.890633Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiv6n8l"
level=info timestamp=2018-08-02T10:08:21.892040Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:08:21.892145Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmiwphdl-wtn2v
Pod phase: Running
level=info timestamp=2018-08-02T10:08:08.097310Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:08:08.101725Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:08:08.103527Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:08:20.934689Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:08:21.007453Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwphdl"
level=info timestamp=2018-08-02T10:08:21.009171Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:08:21.009345Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:10:05.300493Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:10:05.312760Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:10:09.473268Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:10:11.092485Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:13 http: TLS handshake error from 10.244.1.1:36508: EOF
level=info timestamp=2018-08-02T10:10:17.029702Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:23 http: TLS handshake error from 10.244.1.1:36514: EOF
2018/08/02 10:10:33 http: TLS handshake error from 10.244.1.1:36520: EOF
level=info timestamp=2018-08-02T10:10:35.353253Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:10:35.373063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:10:39.497032Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:10:41.110749Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:43 http: TLS handshake error from 10.244.1.1:36526: EOF
level=info timestamp=2018-08-02T10:10:47.084257Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:53 http: TLS handshake error from 10.244.1.1:36532: EOF
Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:05:01.354598Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikgcr7 kind= uid=84de5d89-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.361151Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.361326Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88z74 kind= uid=84dffb66-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.373300Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.373398Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wpnf kind= uid=84e1fef4-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:05:01.383572Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:05:01.383616Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmix7bb8 kind= uid=84e33f17-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.862363Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.863583Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.872013Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.872117Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.888056Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.888120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.908027Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.908080Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:09:40.762023Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:09:40.762255Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock"
level=error timestamp=2018-08-02T10:09:40.762326Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:09:40.762398Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmird6c7"
level=info timestamp=2018-08-02T10:10:42.922056Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikgcr7, existing: false\n"
level=info timestamp=2018-08-02T10:10:42.922143Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:10:42.922222Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:10:42.922313Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmikgcr7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:11:02.689669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmird6c7, existing: true\n"
level=info timestamp=2018-08-02T10:11:02.689761Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:11:02.689788Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:11:02.689863Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:11:02.690236Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock"
level=error timestamp=2018-08-02T10:11:02.690314Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:11:02.690391Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmird6c7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmird6c7" Pod name: virt-handler-sn5j5 Pod phase: Running level=info timestamp=2018-08-02T10:09:43.234348Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiwphdl/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiwphdl" level=info timestamp=2018-08-02T10:09:44.270799Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi9rxdh, existing: true\n" level=info timestamp=2018-08-02T10:09:44.271058Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:09:44.271245Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:09:44.271909Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:09:44.272859Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9rxdh/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi9rxdh/sock" level=error timestamp=2018-08-02T10:09:44.272944Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9rxdh/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:09:44.273051Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9rxdh/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9rxdh" level=info timestamp=2018-08-02T10:09:45.538807Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiv6n8l, existing: true\n" level=info timestamp=2018-08-02T10:09:45.538902Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:09:45.538925Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:09:45.538997Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:09:45.539159Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock" level=error timestamp=2018-08-02T10:09:45.539218Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiv6n8l kind= uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T10:09:45.539293Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiv6n8l" Pod name: virt-launcher-testvmi9rxdh-vqksf Pod phase: Running level=info timestamp=2018-08-02T10:08:09.490507Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:08:09.493305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:08:09.495057Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:08:21.487321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:08:21.555320Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9rxdh" level=info timestamp=2018-08-02T10:08:21.560181Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:08:21.560336Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmird6c7-nmmrl Pod phase: Running level=info timestamp=2018-08-02T10:08:06.787632Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:08:06.788458Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:08:06.790401Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" 
level=info timestamp=2018-08-02T10:08:16.795769Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:08:16.833821Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmird6c7" level=info timestamp=2018-08-02T10:08:16.835410Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:08:16.835553Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiv6n8l-sgbp6 Pod phase: Running level=info timestamp=2018-08-02T10:08:09.132005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:08:09.133117Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:08:09.134744Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:08:21.862814Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:08:21.890633Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiv6n8l" level=info timestamp=2018-08-02T10:08:21.892040Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:08:21.892145Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmiwphdl-wtn2v Pod phase: Running level=info timestamp=2018-08-02T10:08:08.097310Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T10:08:08.101725Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T10:08:08.103527Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T10:08:20.934689Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T10:08:21.007453Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiwphdl" level=info timestamp=2018-08-02T10:08:21.009171Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T10:08:21.009345Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [181.526 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 with a subdomain and a headless service given [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:319 should be able to reach the vmi via its unique fully qualified domain name /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:342 Unexpected Warning event received: testvmiv6n8l,f10e3b54-963b-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ level=info 
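The failure above is the recurring pattern in this run: virt-handler repeatedly tries to dial the virt-launcher command socket before virt-launcher has created it, and the resulting warning event fails the spec's BeforeEach. A minimal sketch (a hypothetical diagnostic, not part of the KubeVirt test suite; the socket path is copied verbatim from the log records above) of how the exact "dial unix ...: connect: no such file or directory" error arises when the socket file is missing, using only Go's standard library:

    // probe_sock.go - hypothetical diagnostic, not KubeVirt source code.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Socket path taken from the virt-handler records above.
        sock := "/var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock"
        conn, err := net.DialTimeout("unix", sock, time.Second)
        if err != nil {
            // While the socket file is absent this prints:
            // dial failed: dial unix /var/run/...: connect: no such file or directory
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()
        fmt.Println("connected to", sock)
    }

virt-handler treats the dial error as transient, which is why each "Synchronizing the VirtualMachineInstance failed." record is immediately followed by a "re-enqueuing VirtualMachineInstance" record; the spec fails only because the warning events keep arriving until the 181.5-second setup window shown in the failure header runs out.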
level=info timestamp=2018-08-02T10:08:03.476559Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiv6n8l kind=VirtualMachineInstance uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiv6n8l-sgbp6"
level=info timestamp=2018-08-02T10:08:24.096022Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiv6n8l kind=VirtualMachineInstance uid=f10e3b54-963b-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiv6n8l-sgbp6"
level=error timestamp=2018-08-02T10:08:24.104384Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiv6n8l kind=VirtualMachineInstance uid=f10e3b54-963b-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiv6n8l/sock: connect: no such file or directory"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:10:35.373063Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:10:39.497032Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:10:41.110749Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:43 http: TLS handshake error from 10.244.1.1:36526: EOF
level=info timestamp=2018-08-02T10:10:47.084257Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:10:53 http: TLS handshake error from 10.244.1.1:36532: EOF
2018/08/02 10:11:03 http: TLS handshake error from 10.244.1.1:36538: EOF
level=info timestamp=2018-08-02T10:11:05.422588Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:11:05.434212Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:11:09.475350Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:11:11.134023Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:13 http: TLS handshake error from 10.244.1.1:36544: EOF
level=info timestamp=2018-08-02T10:11:15.242943Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:11:15.244849Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:11:17.138059Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:08:02.872013Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.872117Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.888056Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.888120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.908027Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.908080Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.390481Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.391522Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.402021Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqxxw9 kind= uid=5d433416-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.402174Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqxxw9 kind= uid=5d433416-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.414365Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.414487Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.425850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.425958Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.516505Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2b8l6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:11:20.292811Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"
level=info timestamp=2018-08-02T10:11:20.933052Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi2b8l6, existing: true\n"
level=info timestamp=2018-08-02T10:11:20.933136Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:11:20.933166Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:11:20.933289Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:11:20.933612Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock"
level=error timestamp=2018-08-02T10:11:20.933693Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:11:20.933785Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"
level=info timestamp=2018-08-02T10:11:22.214026Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi2b8l6, existing: true\n"
level=info timestamp=2018-08-02T10:11:22.214108Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:11:22.214137Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:11:22.214280Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:11:22.214475Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock"
level=error timestamp=2018-08-02T10:11:22.214545Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:11:22.214626Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:11:23.265443Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8hqpv"
level=info timestamp=2018-08-02T10:11:23.287057Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi8hqpv, existing: true\n"
level=info timestamp=2018-08-02T10:11:23.287097Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:11:23.287120Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:11:23.287180Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:11:23.287299Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock"
level=error timestamp=2018-08-02T10:11:23.287345Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:11:23.287411Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8hqpv"
level=info timestamp=2018-08-02T10:11:23.335040Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi8hqpv, existing: true\n"
level=info timestamp=2018-08-02T10:11:23.335113Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:11:23.335136Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:11:23.335226Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:11:23.335499Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock"
level=error timestamp=2018-08-02T10:11:23.335564Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:11:23.335635Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8hqpv"

Pod name: virt-launcher-testvmi2b8l6-6zrls
Pod phase: Running
level=info timestamp=2018-08-02T10:11:08.091201Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:08.093444Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:08.095810Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:18.107167Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:18.133344Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi2b8l6"
level=info timestamp=2018-08-02T10:11:18.136230Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:18.136388Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi8hqpv-d4f8d
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.163938Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.164869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.172216Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:21.627428Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:21.686858Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi8hqpv"
level=info timestamp=2018-08-02T10:11:21.693201Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:21.693631Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiq5nkk-srt95
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.228266Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.231017Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.237515Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:23.425878Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:23.442774Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiq5nkk"
level=info timestamp=2018-08-02T10:11:23.444529Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:23.444689Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiqxxw9-q8dmw
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.118589Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.127174Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.132606Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:22.414620Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:22.455193Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqxxw9"
level=info timestamp=2018-08-02T10:11:22.466820Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:22.467069Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:13:11.234070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:13 http: TLS handshake error from 10.244.1.1:36616: EOF
level=info timestamp=2018-08-02T10:13:14.620786Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:13:14.621770Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:13:17.357361Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:23 http: TLS handshake error from 10.244.1.1:36622: EOF
2018/08/02 10:13:33 http: TLS handshake error from 10.244.1.1:36628: EOF
level=info timestamp=2018-08-02T10:13:35.691926Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:13:35.716225Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:13:39.453230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:13:41.255937Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:43 http: TLS handshake error from 10.244.1.1:36634: EOF
level=info timestamp=2018-08-02T10:13:47.410953Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:53 http: TLS handshake error from 10.244.1.1:36640: EOF
2018/08/02 10:14:03 http: TLS handshake error from 10.244.1.1:36646: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:08:02.872013Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.872117Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwphdl kind= uid=f10ff845-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.888056Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.888120Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmird6c7 kind= uid=f112bbc1-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:08:02.908027Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:08:02.908080Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9rxdh kind= uid=f114263d-963b-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.390481Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.391522Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.402021Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqxxw9 kind= uid=5d433416-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.402174Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqxxw9 kind= uid=5d433416-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.414365Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.414487Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.425850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:11:04.425958Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.516505Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2b8l6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:12:41.578893Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:12:41.579310Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock"
level=error timestamp=2018-08-02T10:12:41.579375Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:12:41.579447Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"
level=info timestamp=2018-08-02T10:13:46.530624Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmird6c7, existing: false\n"
level=info timestamp=2018-08-02T10:13:46.530721Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:13:46.530805Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:13:46.530885Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmird6c7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:14:03.501006Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi2b8l6, existing: true\n"
level=info timestamp=2018-08-02T10:14:03.501103Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:14:03.501135Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:14:03.501231Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:14:03.501551Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock"
level=error timestamp=2018-08-02T10:14:03.501631Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind= uid=5d451c0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:14:03.501715Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi2b8l6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"

Pod name: virt-handler-sn5j5
Pod phase: Running
level=info timestamp=2018-08-02T10:12:44.840090Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiqxxw9/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiqxxw9"
level=info timestamp=2018-08-02T10:12:45.186584Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi8hqpv, existing: true\n"
level=info timestamp=2018-08-02T10:12:45.186676Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:12:45.186708Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:12:45.186783Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:12:45.187032Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock"
level=error timestamp=2018-08-02T10:12:45.187089Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi8hqpv kind= uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:12:45.187154Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi8hqpv"
level=info timestamp=2018-08-02T10:12:46.650742Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiq5nkk, existing: true\n"
level=info timestamp=2018-08-02T10:12:46.650838Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:12:46.650862Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:12:46.650935Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:12:46.651113Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiq5nkk/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiq5nkk/sock"
level=error timestamp=2018-08-02T10:12:46.651179Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiq5nkk/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:12:46.651248Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiq5nkk/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiq5nkk"

Pod name: virt-launcher-testvmi2b8l6-6zrls
Pod phase: Running
level=info timestamp=2018-08-02T10:11:08.091201Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:08.093444Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:08.095810Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:18.107167Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:18.133344Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi2b8l6"
level=info timestamp=2018-08-02T10:11:18.136230Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:18.136388Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmi8hqpv-d4f8d
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.163938Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.164869Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.172216Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:21.627428Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:21.686858Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi8hqpv"
level=info timestamp=2018-08-02T10:11:21.693201Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:21.693631Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiq5nkk-srt95
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.228266Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.231017Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.237515Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:23.425878Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:23.442774Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiq5nkk"
level=info timestamp=2018-08-02T10:11:23.444529Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:23.444689Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmiqxxw9-q8dmw
Pod phase: Running
level=info timestamp=2018-08-02T10:11:10.118589Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:11:10.127174Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:11:10.132606Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:11:22.414620Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:11:22.455193Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqxxw9"
level=info timestamp=2018-08-02T10:11:22.466820Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:11:22.467069Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.505 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with custom interface model [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:368
    should expose the right device type to the guest
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:369

    Unexpected Warning event received: testvmi8hqpv,5d41ae67-963c-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory
    Expected
        <string>: Warning
    not to equal
        <string>: Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
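Besides the socket dial errors, the dumps above also show virt-controller hitting optimistic-concurrency conflicts ("Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io ...: the object has been modified; please apply your changes to the latest version and try again") and reenqueuing the affected VMIs. A rough sketch of the standard client-go remedy for this class of error (illustrative only, not KubeVirt's actual controller code; the group, resource, and VMI name are copied from the log, and the simulated conflict is invented for the example):

    // conflict_retry.go - illustrative sketch, not KubeVirt source code.
    package main

    import (
        "errors"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        "k8s.io/apimachinery/pkg/runtime/schema"
        "k8s.io/client-go/util/retry"
    )

    func main() {
        attempts := 0
        // RetryOnConflict re-runs the function with backoff for as long as it
        // returns a Conflict error, mirroring the controller's reenqueue loop.
        err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
            attempts++
            if attempts == 1 {
                // Simulate the first update racing a concurrent writer.
                return apierrors.NewConflict(
                    schema.GroupResource{Group: "kubevirt.io", Resource: "virtualmachineinstances"},
                    "testvmi2b8l6",
                    errors.New("the object has been modified; please apply your changes to the latest version and try again"))
            }
            // A real controller would re-read the latest object here and
            // reapply its change before updating again.
            return nil
        })
        fmt.Println("attempts:", attempts, "err:", err)
    }

These conflicts appear transient here (the controller simply reenqueues and the VMIs still reach the Scheduled phase); the assertion that fails in both specs is the warning event from the unreachable launcher socket, not the conflict.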
level=info timestamp=2018-08-02T10:11:05.038153Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi8hqpv kind=VirtualMachineInstance uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi8hqpv-d4f8d"
level=info timestamp=2018-08-02T10:11:23.732389Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi8hqpv kind=VirtualMachineInstance uid=5d41ae67-963c-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi8hqpv-d4f8d"
level=error timestamp=2018-08-02T10:11:23.744042Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi8hqpv kind=VirtualMachineInstance uid=5d41ae67-963c-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi8hqpv/sock: connect: no such file or directory"

Pod name: disks-images-provider-bbjxx
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running
level=info timestamp=2018-08-02T10:13:39.453230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:13:41.255937Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:43 http: TLS handshake error from 10.244.1.1:36634: EOF
level=info timestamp=2018-08-02T10:13:47.410953Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:13:53 http: TLS handshake error from 10.244.1.1:36640: EOF
2018/08/02 10:14:03 http: TLS handshake error from 10.244.1.1:36646: EOF
level=info timestamp=2018-08-02T10:14:05.747569Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:05.781742Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:09.463841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:14:11.276110Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:14:13 http: TLS handshake error from 10.244.1.1:36652: EOF
level=info timestamp=2018-08-02T10:14:14.625104Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:14:14.627781Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:14:17.469486Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:14:23 http: TLS handshake error from 10.244.1.1:36658: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running
level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running
level=info timestamp=2018-08-02T10:11:04.425958Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.516505Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2b8l6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"
level=info timestamp=2018-08-02T10:14:05.892192Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.893312Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.896678Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.896756Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.909493Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.909576Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.935716Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.935792Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.985532Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipqrhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:14:05.992526Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitkfl6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:06.734691Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipqrhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:14:07.098738Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitkfl6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:07.297810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7gmx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7gmx"

Pod name: virt-handler-92p79
Pod phase: Running
level=info timestamp=2018-08-02T10:14:22.259585Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:22.899871Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitkfl6, existing: true\n"
level=info timestamp=2018-08-02T10:14:22.899985Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:14:22.900016Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:14:22.900122Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:14:22.900483Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock"
level=error timestamp=2018-08-02T10:14:22.900584Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:14:22.900681Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:24.180955Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitkfl6, existing: true\n"
level=info timestamp=2018-08-02T10:14:24.181049Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:14:24.181080Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:14:24.181185Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:14:24.181549Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock"
level=error timestamp=2018-08-02T10:14:24.181623Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:14:24.181733Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"

Pod name: virt-handler-sn5j5
Pod phase: Running

level=info timestamp=2018-08-02T10:14:25.712289Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi88d94"
level=info timestamp=2018-08-02T10:14:25.811522Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipqrhb, existing: true\n"
level=info timestamp=2018-08-02T10:14:25.811612Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:14:25.811637Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:14:25.811755Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:14:25.811933Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipqrhb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipqrhb/sock"
level=error timestamp=2018-08-02T10:14:25.812007Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipqrhb/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:14:25.812096Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipqrhb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:14:25.872493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi88d94, existing: true\n"
level=info timestamp=2018-08-02T10:14:25.872528Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:14:25.872551Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:14:25.872601Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:14:25.872748Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock"
level=error timestamp=2018-08-02T10:14:25.872801Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:14:25.872875Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi88d94"

Pod name: virt-launcher-testvmi88d94-8x62h
Pod phase: Running

level=info timestamp=2018-08-02T10:14:11.629670Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:11.630439Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:11.633016Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:24.649515Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:24.673572Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi88d94"
level=info timestamp=2018-08-02T10:14:24.675773Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:24.675998Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmip7gmx-k5mbr
Pod phase: Running

level=info timestamp=2018-08-02T10:14:12.411486Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:12.411754Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:12.417607Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:24.219574Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:24.241916Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmip7gmx"
level=info timestamp=2018-08-02T10:14:24.243393Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:24.243600Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmipqrhb-v4jj6
Pod phase: Running

level=info timestamp=2018-08-02T10:14:11.110111Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:11.112807Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:11.114571Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:23.075197Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:23.145300Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipqrhb"
level=info timestamp=2018-08-02T10:14:23.157175Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:23.157391Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmitkfl6-5b7nm
Pod phase: Running

level=info timestamp=2018-08-02T10:14:09.760283Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:09.760557Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:09.772242Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:19.781390Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:19.802955Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitkfl6"
level=info timestamp=2018-08-02T10:14:19.805066Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:19.805245Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: disks-images-provider-bbjxx
Pod phase: Running

copy all images to host mount directory

Pod name: disks-images-provider-ssdb6
Pod phase: Running

copy all images to host mount directory

Pod name: virt-api-7d79975b94-9j92f
Pod phase: Running

level=info timestamp=2018-08-02T10:16:11.365213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:16:13 http: TLS handshake error from 10.244.1.1:36724: EOF
level=info timestamp=2018-08-02T10:16:14.640577Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:16:14.643879Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:16:17.690394Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:16:23 http: TLS handshake error from 10.244.1.1:36730: EOF
2018/08/02 10:16:33 http: TLS handshake error from 10.244.1.1:36736: EOF
level=info timestamp=2018-08-02T10:16:36.021013Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:16:36.026771Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:16:39.494828Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:16:41.386437Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:16:43 http: TLS handshake error from 10.244.1.1:36742: EOF
level=info timestamp=2018-08-02T10:16:47.745928Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:16:53 http: TLS handshake error from 10.244.1.1:36748: EOF
2018/08/02 10:17:03 http: TLS handshake error from 10.244.1.1:36754: EOF

Pod name: virt-controller-67dcdd8464-w2xzj
Pod phase: Running

level=info timestamp=2018-08-02T09:35:45.275673Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-67dcdd8464-wvph9
Pod phase: Running

level=info timestamp=2018-08-02T10:11:04.425958Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiq5nkk kind= uid=5d46de30-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:11:04.516505Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2b8l6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2b8l6"
level=info timestamp=2018-08-02T10:14:05.892192Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.893312Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.896678Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.896756Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipqrhb kind= uid=c9720e74-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.909493Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.909576Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.935716Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:14:05.935792Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:14:05.985532Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipqrhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:14:05.992526Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitkfl6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:06.734691Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipqrhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:14:07.098738Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitkfl6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:14:07.297810Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7gmx\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7gmx"

Pod name: virt-handler-92p79
Pod phase: Running

level=info timestamp=2018-08-02T10:15:43.553759Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:15:43.554073Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock"
level=error timestamp=2018-08-02T10:15:43.554152Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:15:43.554234Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"
level=info timestamp=2018-08-02T10:16:47.342222Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi2b8l6, existing: false\n"
level=info timestamp=2018-08-02T10:16:47.342324Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:16:47.342394Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:16:47.342492Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi2b8l6 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:17:05.476109Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitkfl6, existing: true\n"
level=info timestamp=2018-08-02T10:17:05.476183Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:17:05.476212Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:17:05.476298Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:17:05.476585Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock"
level=error timestamp=2018-08-02T10:17:05.476661Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitkfl6 kind= uid=c9730a61-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:17:05.476747Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitkfl6/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitkfl6"

Pod name: virt-handler-sn5j5
Pod phase: Running

level=info timestamp=2018-08-02T10:15:46.461368Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipqrhb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipqrhb"
level=info timestamp=2018-08-02T10:15:46.818727Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmip7gmx, existing: true\n"
level=info timestamp=2018-08-02T10:15:46.818816Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:15:46.818858Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:15:46.818963Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:15:46.819179Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmip7gmx/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmip7gmx/sock"
level=error timestamp=2018-08-02T10:15:46.819267Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmip7gmx kind= uid=c974ae42-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmip7gmx/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:15:46.819350Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmip7gmx/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmip7gmx"
level=info timestamp=2018-08-02T10:15:47.480455Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi88d94, existing: true\n"
level=info timestamp=2018-08-02T10:15:47.480540Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:15:47.480574Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:15:47.480668Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:15:47.480983Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock"
level=error timestamp=2018-08-02T10:15:47.481057Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi88d94 kind= uid=c970bb0d-963c-11e8-8bc5-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:15:47.481140Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi88d94"

Pod name: virt-launcher-testvmi88d94-8x62h
Pod phase: Running

level=info timestamp=2018-08-02T10:14:11.629670Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:11.630439Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:11.633016Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:24.649515Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:24.673572Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi88d94"
level=info timestamp=2018-08-02T10:14:24.675773Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:24.675998Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmip7gmx-k5mbr
Pod phase: Running

level=info timestamp=2018-08-02T10:14:12.411486Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:12.411754Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:12.417607Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:24.219574Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:24.241916Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmip7gmx"
level=info timestamp=2018-08-02T10:14:24.243393Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:24.243600Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmipqrhb-v4jj6
Pod phase: Running

level=info timestamp=2018-08-02T10:14:11.110111Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:11.112807Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:11.114571Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:23.075197Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:23.145300Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipqrhb"
level=info timestamp=2018-08-02T10:14:23.157175Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:23.157391Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmitkfl6-5b7nm
Pod phase: Running

level=info timestamp=2018-08-02T10:14:09.760283Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:14:09.760557Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:14:09.772242Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:14:19.781390Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:14:19.802955Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitkfl6"
level=info timestamp=2018-08-02T10:14:19.805066Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:14:19.805245Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [181.487 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  VirtualMachineInstance with default interface model [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:382
    should expose the right device type to the guest
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:383

    Unexpected Warning event received: testvmi88d94,c970bb0d-963c-11e8-8bc5-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory
    Expected
        <string>: Warning
    not to equal
        <string>: Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
level=info timestamp=2018-08-02T10:14:06.521290Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi88d94 kind=VirtualMachineInstance uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi88d94-8x62h"
level=info timestamp=2018-08-02T10:14:26.037094Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi88d94 kind=VirtualMachineInstance uid=c970bb0d-963c-11e8-8bc5-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi88d94-8x62h"
level=error timestamp=2018-08-02T10:14:26.414432Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi88d94 kind=VirtualMachineInstance uid=c970bb0d-963c-11e8-8bc5-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi88d94/sock: connect: no such file or directory"

Received interrupt. Emitting contents of GinkgoWriter...
---------------------------------------------------------
level=info timestamp=2018-08-02T10:17:08.079286Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigspr2 kind=VirtualMachineInstance uid=359ddf71-963d-11e8-8bc5-525500d15501 msg="Created virtual machine pod virt-launcher-testvmigspr2-6kckm"
---------------------------------------------------------
Received interrupt. Running AfterSuite...
^C again to terminate immediately