Discussion:
[ovirt-users] Diary of hosted engine install woes
m***@brendanh.com
2018-10-07 22:03:21 UTC
Permalink
Hi,

I've been attempting to install the hosted engine intermittently for months without success. I raised one bug, which has since been fixed (https://bugzilla.redhat.com/show_bug.cgi?id=1622240), but I am still hitting various other problems. I've freshly installed oVirt Node v4.2.7 (Second Release Candidate), launched the web UI, and configured an XOR bond. Then, using the answer file below, I attempt to install the hosted engine with:

hosted-engine --deploy --config-append=/var/lib/ovirt-hosted-engine-setup/answers/answers.conf
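(As an aside, the bond mode can be sanity-checked before deploying by reading /proc/net/bonding/bond0. A minimal sketch, assuming the usual "Bonding Mode:" line layout of that file — not part of the setup tooling itself:)

```python
from pathlib import Path

def bond_mode(text: str):
    """Extract the mode from /proc/net/bonding/<name>-style text."""
    for line in text.splitlines():
        if line.startswith("Bonding Mode:"):
            return line.split(":", 1)[1].strip()
    return None

# On a live host you would read the real file:
#   print(bond_mode(Path("/proc/net/bonding/bond0").read_text()))

sample = ("Ethernet Channel Bonding Driver: v3.7.1\n"
          "Bonding Mode: load balancing (xor)\n")
print(bond_mode(sample))  # → load balancing (xor)
```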

where answers.conf contains:
[environment:default]
OVEHOSTED_CORE/deployProceed=bool:True
OVEHOSTED_CORE/rollbackProceed=none:None
OVEHOSTED_CORE/screenProceed=bool:True
OVEHOSTED_CORE/upgradeProceed=none:None
OVEHOSTED_ENGINE/clusterName=str:Default
OVEHOSTED_ENGINE/enableHcGlusterService=none:None
OVEHOSTED_ENGINE/insecureSSL=none:None
OVEHOSTED_NETWORK/bridgeName=str:ovirtmgmt
OVEHOSTED_NETWORK/fqdn=str:ovirt-engine.example.com
OVEHOSTED_NETWORK/gateway=str:10.0.0.1
OVEHOSTED_NOTIF/destEmail=str:***@example.com
OVEHOSTED_NOTIF/smtpPort=str:25
OVEHOSTED_NOTIF/smtpServer=str:smtp.emailprovider.com
OVEHOSTED_NOTIF/sourceEmail=str:***@example.com
OVEHOSTED_STORAGE/LunID=none:None
OVEHOSTED_STORAGE/discardSupport=bool:False
OVEHOSTED_STORAGE/domainType=str:nfs
OVEHOSTED_STORAGE/iSCSIDiscoverUser=none:None
OVEHOSTED_STORAGE/iSCSIPortal=none:None
OVEHOSTED_STORAGE/iSCSIPortalIPAddress=none:None
OVEHOSTED_STORAGE/iSCSIPortalPort=none:None
OVEHOSTED_STORAGE/iSCSIPortalUser=none:None
OVEHOSTED_STORAGE/iSCSITargetName=none:None
OVEHOSTED_STORAGE/imgSizeGB=str:50
OVEHOSTED_STORAGE/imgUUID=str:4ff5a24d-7993-4be3-9301-e829030f9dc1
OVEHOSTED_STORAGE/lockspaceImageUUID=none:None
OVEHOSTED_STORAGE/lockspaceVolumeUUID=none:None
OVEHOSTED_STORAGE/metadataImageUUID=none:None
OVEHOSTED_STORAGE/metadataVolumeUUID=none:None
OVEHOSTED_STORAGE/mntOptions=str:
OVEHOSTED_STORAGE/nfsVersion=str:auto
OVEHOSTED_STORAGE/storageDomainConnection=str:nas.example.com:/export/ovirt_share
OVEHOSTED_STORAGE/storageDomainName=str:hosted_storage
OVEHOSTED_STORAGE/volUUID=str:b8e5d967-4ffc-452d-8001-fa9369a6a11b
OVEHOSTED_VM/automateVMShutdown=bool:True
OVEHOSTED_VM/cdromUUID=str:c4b91916-c96b-47cd-a945-5d662ea8fc5d
OVEHOSTED_VM/cloudInitISO=str:generate
OVEHOSTED_VM/cloudinitExecuteEngineSetup=bool:True
OVEHOSTED_VM/cloudinitInstanceDomainName=str:example.com
OVEHOSTED_VM/cloudinitInstanceHostName=str:ovirt-engine.example.com
OVEHOSTED_VM/cloudinitVMDNS=bool:False
OVEHOSTED_VM/cloudinitVMETCHOSTS=bool:False
OVEHOSTED_VM/cloudinitVMStaticCIDR=bool:False
OVEHOSTED_VM/cloudinitVMTZ=str:Europe/London
OVEHOSTED_VM/consoleUUID=str:24d13b14-cc56-420a-966c-f0d7341cb19b
OVEHOSTED_VM/emulatedMachine=str:pc
OVEHOSTED_VM/nicUUID=str:77aa84f1-744b-45cc-a298-3905acda6944
OVEHOSTED_VM/ovfArchive=str:
OVEHOSTED_VM/rootSshAccess=str:yes
OVEHOSTED_VM/rootSshPubkey=str:<public_key>
OVEHOSTED_VM/vmCDRom=none:None
OVEHOSTED_VM/vmMACAddr=str:00:18:3d:5b:11:5c
OVEHOSTED_VM/vmMemSizeMB=int:16384
OVEHOSTED_VM/vmVCpus=str:4
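(For anyone unfamiliar with the format: each entry above is typed as SECTION/key=TYPE:value. A rough sketch of how these entries decode — the real otopi parser supports more types than shown here:)

```python
def parse_answer_line(line: str):
    """Decode one otopi answer-file entry of the form KEY=TYPE:value."""
    key, _, typed = line.partition("=")
    vtype, _, raw = typed.partition(":")
    if vtype == "none":
        return key, None
    if vtype == "bool":
        return key, raw == "True"
    if vtype == "int":
        return key, int(raw)
    return key, raw  # str (and any unknown type) kept verbatim

print(parse_answer_line("OVEHOSTED_VM/vmMemSizeMB=int:16384"))
# → ('OVEHOSTED_VM/vmMemSizeMB', 16384)
```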

After prompting for a couple of passwords, the deployment proceeds as far as the task "Clean /etc/hosts on the host", then fails with:
[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
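(In case it helps anyone triaging this: "list object has no element 0" usually means a Jinja2 expression in the playbook took element [0] of a list that an earlier task left empty, e.g. a registered result or filter that matched nothing. A toy Python illustration of the failure mode and the usual guard — the variable names are hypothetical, not taken from create_target_vm.yml:)

```python
# A filter over earlier task results that matched nothing yields [].
matched = []  # hypothetical registered result

# What the playbook effectively does, and why Ansible reports
# "list object has no element 0":
try:
    first = matched[0]
except IndexError:
    first = None

# The defensive equivalent of "{{ matched | first | default(None) }}":
first = matched[0] if matched else None
print(first)  # → None
```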

Here is the hosted-engine-setup log:
Loaded plugins: fastestmirror, product-id, subscription-manager
This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-10-07 20:47:52,180+0100 INFO otopi.context context.runSequence:741 Stage: Initializing
2018-10-07 20:47:52,215+0100 INFO otopi.context context.runSequence:741 Stage: Environment setup
2018-10-07 20:47:52,449+0100 INFO otopi.context context.runSequence:741 Stage: Environment packages setup
Loaded plugins: fastestmirror, product-id, subscription-manager
This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-10-07 20:47:52,636+0100 INFO otopi.context context.runSequence:741 Stage: Programs detection
LANG=en_GB.UTF-8
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin

LANG=en_GB.UTF-8
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin

2018-10-07 20:47:52,778+0100 INFO otopi.context context.runSequence:741 Stage: Environment setup
2018-10-07 20:47:52,787+0100 INFO otopi.context context.runSequence:741 Stage: Environment customization
LoadState=loaded

PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data.
64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.283 ms

--- 10.0.0.1 ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.283/0.283/0.283/0.000 ms

2018-10-07 20:47:54,133+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:47:55,336+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:47:55,738+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Detecting interface on existing management bridge]
2018-10-07 20:47:56,239+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:47:57,544+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get all active network interfaces]
2018-10-07 20:48:00,250+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Filter bonds with bad naming]
2018-10-07 20:48:01,855+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Generate output list]
2018-10-07 20:48:02,256+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
21: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
inet 10.0.0.171/24 brd 10.0.0.255 scope global noprefixroute bond0
valid_lft forever preferred_lft forever
inet6 fe80::d233:e638:11a5:1fc0/64 scope link noprefixroute
valid_lft forever preferred_lft forever


1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: ens6: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast master bond0 state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
3: enp6s0: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc mq master bond0 state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
19: virbr0: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc noqueue state DOWN group default qlen 1000
link/ether 52:54:00:7d:cb:57 brd ff:ff:ff:ff:ff:ff
inet 192.168.124.1/24 brd 192.168.124.255 scope global virbr0
valid_lft forever preferred_lft forever
20: virbr0-nic: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc pfifo_fast master virbr0 state DOWN group default qlen 1000
link/ether 52:54:00:7d:cb:57 brd ff:ff:ff:ff:ff:ff
21: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
inet 10.0.0.171/24 brd 10.0.0.255 scope global noprefixroute bond0
valid_lft forever preferred_lft forever
inet6 fe80::d233:e638:11a5:1fc0/64 scope link noprefixroute
valid_lft forever preferred_lft forever

2018-10-07 20:48:21,545+0100 INFO otopi.context context.runSequence:741 Stage: Setup validation
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: ens6: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast master bond0 state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
3: enp6s0: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc mq master bond0 state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
19: virbr0: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc noqueue state DOWN group default qlen 1000
link/ether 52:54:00:7d:cb:57 brd ff:ff:ff:ff:ff:ff
inet 192.168.124.1/24 brd 192.168.124.255 scope global virbr0
valid_lft forever preferred_lft forever
20: virbr0-nic: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc pfifo_fast master virbr0 state DOWN group default qlen 1000
link/ether 52:54:00:7d:cb:57 brd ff:ff:ff:ff:ff:ff
21: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
link/ether d8:cb:8c:4e:cd:12 brd ff:ff:ff:ff:ff:ff
inet 10.0.0.171/24 brd 10.0.0.255 scope global noprefixroute bond0
valid_lft forever preferred_lft forever
inet6 fe80::d233:e638:11a5:1fc0/64 scope link noprefixroute
valid_lft forever preferred_lft forever

2018-10-07 20:48:21,610+0100 WARNING otopi.plugins.gr_he_common.network.bridge hostname._validateFQDN:387 Host name host has no domain suffix
host. 0 IN A 10.0.0.171



2018-10-07 20:48:21,693+0100 INFO otopi.context context.runSequence:741 Stage: Transaction setup
Loaded plugins: fastestmirror, product-id, subscription-manager
This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-10-07 20:48:21,785+0100 INFO otopi.context context.runSequence:741 Stage: Misc configuration
2018-10-07 20:48:21,787+0100 INFO otopi.context context.runSequence:741 Stage: Package installation
2018-10-07 20:48:21,823+0100 INFO otopi.context context.runSequence:741 Stage: Misc configuration
2018-10-07 20:48:21,839+0100 INFO otopi.context context.runSequence:741 Stage: Transaction commit
Loaded plugins: fastestmirror, product-id, subscription-manager
This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-10-07 20:48:21,885+0100 INFO otopi.context context.runSequence:741 Stage: Closing up
2018-10-07 20:48:21,886+0100 INFO otopi.plugins.gr_he_ansiblesetup.core.misc misc.initial_clean_up:229 Cleaning previous attempts
2018-10-07 20:48:22,895+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:48:23,998+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:24,500+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check firewalld status]
2018-10-07 20:48:25,402+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:25,904+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Enforce firewalld status]
2018-10-07 20:48:26,306+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:48:26,707+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Stop libvirt service]
2018-10-07 20:48:27,610+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:28,112+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Drop vdsm config statements]
2018-10-07 20:48:30,216+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Restore initial abrt config files]
2018-10-07 20:48:33,823+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Restart abrtd service]
2018-10-07 20:48:34,725+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:35,127+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Drop libvirt sasl2 configuration by vdsm]
2018-10-07 20:48:35,829+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:36,230+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Stop and disable services]
2018-10-07 20:48:38,334+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Start libvirt]
2018-10-07 20:48:39,337+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:39,739+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check for leftover local Hosted Engine VM]
2018-10-07 20:48:40,541+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:40,943+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Destroy leftover local Hosted Engine VM]
2018-10-07 20:48:41,344+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:48:41,746+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check for leftover defined local Hosted Engine VM]
2018-10-07 20:48:42,448+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:42,849+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Undefine leftover local engine VM]
2018-10-07 20:48:43,250+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:48:43,652+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check for leftover defined Hosted Engine VM]
2018-10-07 20:48:44,454+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:44,856+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Undefine leftover engine VM]
2018-10-07 20:48:45,157+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:48:45,558+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Remove eventually entries for the local VM from known_hosts file]
2018-10-07 20:48:46,360+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:46,663+0100 INFO otopi.plugins.gr_he_ansiblesetup.core.misc misc._closeup:195 Starting local VM
2018-10-07 20:48:47,672+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:48:48,775+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:49,277+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Start libvirt]
2018-10-07 20:48:50,180+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:50,681+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Activate default libvirt network]
2018-10-07 20:48:51,484+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:52,687+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get libvirt interfaces]
2018-10-07 20:48:53,390+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:53,691+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get routing rules]
2018-10-07 20:48:54,493+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:55,797+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Save bridge name]
2018-10-07 20:48:56,098+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:56,500+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for the bridge to appear on the host]
2018-10-07 20:48:57,102+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:48:57,503+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Refresh network facts]
2018-10-07 20:48:57,905+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:48:58,306+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Prepare CIDR for virbr0]
2018-10-07 20:48:58,708+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:48:59,209+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add outbound route rules]
2018-10-07 20:48:59,911+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:01,115+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add inbound route rules]
2018-10-07 20:49:01,817+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:03,122+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:49:05,427+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:05,929+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 20:49:06,330+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:06,731+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Avoid localhost]
2018-10-07 20:49:07,133+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:07,434+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get host address resolution]
2018-10-07 20:49:08,237+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:09,440+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check address resolution]
2018-10-07 20:49:09,842+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:10,243+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse host address resolution]
2018-10-07 20:49:10,645+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:11,849+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Ensure host address resolves locally]
2018-10-07 20:49:12,250+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:12,652+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get target address from selected interface]
2018-10-07 20:49:13,153+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:14,458+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check the resolved address resolves on the selected interface]
2018-10-07 20:49:14,859+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:15,261+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check for alias]
2018-10-07 20:49:15,963+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:17,166+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Ensure the resolved address resolves only on the selected interface]
2018-10-07 20:49:17,568+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:18,069+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Avoid localhost]
2018-10-07 20:49:18,471+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:18,872+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get engine FQDN resolution]
2018-10-07 20:49:19,474+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:20,678+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check engine FQDN resolution]
2018-10-07 20:49:21,080+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:21,481+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse engine FQDN resolution]
2018-10-07 20:49:21,983+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:23,187+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Ensure engine FQDN doesn't resolve locally]
2018-10-07 20:49:23,588+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:49:23,890+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check http/https proxy]
2018-10-07 20:49:24,792+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Register the engine FQDN as a host]
2018-10-07 20:49:25,093+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:25,495+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create directory for local VM]
2018-10-07 20:49:26,397+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:26,799+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set local vm dir path]
2018-10-07 20:49:27,200+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:27,602+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fix local VM directory permission]
2018-10-07 20:49:28,504+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:49:29,005+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 20:49:29,307+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:49:29,808+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Install ovirt-engine-appliance rpm]
2018-10-07 20:53:29,731+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:53:30,234+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse appliance configuration for path]
2018-10-07 20:53:30,936+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:53:32,140+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse appliance configuration for sha1sum]
2018-10-07 20:53:32,842+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:53:34,045+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get OVA path]
2018-10-07 20:53:34,447+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:53:35,651+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Compute sha1sum]
2018-10-07 20:53:37,655+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:53:38,959+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Compare sha1sum]
2018-10-07 20:53:39,261+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:53:39,662+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Register appliance PATH]
2018-10-07 20:53:40,063+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:53:41,267+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Extract appliance to local VM directory]
2018-10-07 20:54:37,748+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:54:38,250+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 20:54:38,652+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:54:50,775+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Find the local appliance image]
2018-10-07 20:54:51,577+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:54:52,781+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set local_vm_disk_path]
2018-10-07 20:54:53,182+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:54:53,584+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get appliance disk size]
2018-10-07 20:54:54,386+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:54:55,590+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse qemu-img output]
2018-10-07 20:54:55,992+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:54:57,196+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create cloud init user-data and meta-data files]
2018-10-07 20:54:59,200+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create ISO disk]
2018-10-07 20:54:59,902+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:55:00,403+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create local VM]
2018-10-07 20:55:02,307+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:55:03,611+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get local VM IP]
2018-10-07 20:55:25,649+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:55:26,953+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Remove eventually entries for the local VM from /etc/hosts]
2018-10-07 20:55:27,755+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:55:28,256+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create an entry in /etc/hosts for the local VM]
2018-10-07 20:55:28,958+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:55:29,460+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for SSH to restart on the local VM]
2018-10-07 20:56:00,316+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost -> localhost]
2018-10-07 20:56:00,818+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:56:03,123+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [ovirt-engine.example.com]
2018-10-07 20:56:03,624+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for the local VM]
2018-10-07 20:56:10,337+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [ovirt-engine.example.com]
2018-10-07 20:56:10,939+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add an entry for this host on /etc/hosts on the local VM]
2018-10-07 20:56:12,443+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:56:12,944+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set FQDN]
2018-10-07 20:56:14,548+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:56:15,050+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Force the local VM FQDN to temporary resolve on the natted network address]
2018-10-07 20:56:16,654+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:56:17,155+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Restore sshd reverse DNS lookups]
2018-10-07 20:56:18,659+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:56:19,261+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Generate an answer file for engine-setup]
2018-10-07 20:56:21,665+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:56:22,167+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Include before engine-setup custom tasks files for the engine VM]
2018-10-07 20:56:23,972+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Execute engine-setup]
2018-10-07 20:57:52,926+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:57:54,430+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Include after engine-setup custom tasks files for the engine VM]
2018-10-07 20:57:56,234+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Configure LibgfApi support]
2018-10-07 20:57:56,735+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [ovirt-engine.example.com]
2018-10-07 20:57:58,139+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Restart ovirt-engine service for LibgfApi support]
2018-10-07 20:57:58,540+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [ovirt-engine.example.com]
2018-10-07 20:57:59,944+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Mask cloud-init services to speed up future boot]
2018-10-07 20:58:02,849+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Clean up bootstrap answer file]
2018-10-07 20:58:04,252+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 20:58:04,854+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 20:58:06,157+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:58:06,759+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for ovirt-engine service to start]
2018-10-07 20:58:07,962+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:58:09,767+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Detect VLAN ID]
2018-10-07 20:58:10,569+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:58:12,174+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set Engine public key as authorized key without validating the TLS/SSL certificates]
2018-10-07 20:58:13,276+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:58:13,878+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 20:58:14,380+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:58:14,982+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Obtain SSO token using username/password credentials]
2018-10-07 20:58:16,285+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 20:58:16,987+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Enable GlusterFS at cluster level]
2018-10-07 20:58:17,489+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:58:17,990+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set VLAN ID at datacenter level]
2018-10-07 20:58:18,492+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 20:58:19,094+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Force host-deploy in offline mode]
2018-10-07 20:58:20,096+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:58:20,598+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add host]
2018-10-07 20:58:22,401+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 20:58:23,004+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for the host to be up]
2018-10-07 22:00:02,131+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:03,835+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check host status]
2018-10-07 22:00:04,336+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:04,937+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Remove host-deploy configuration file]
2018-10-07 22:00:05,639+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]

2018-10-07 22:00:06,051+0100 INFO otopi.plugins.gr_he_ansiblesetup.core.storage_domain storage_domain._closeup:751 Creating Storage Domain
2018-10-07 22:00:06,856+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 22:00:07,658+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:08,159+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check local VM dir stat]
2018-10-07 22:00:08,860+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:09,361+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Enforce local VM dir existence]
2018-10-07 22:00:09,863+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:10,364+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 22:00:10,865+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:11,366+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Obtain SSO token using username/password credentials]
2018-10-07 22:00:12,268+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:12,769+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch host facts]
2018-10-07 22:00:13,671+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:15,274+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch cluster ID]
2018-10-07 22:00:15,775+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:16,276+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch cluster facts]
2018-10-07 22:00:17,178+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:18,781+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter facts]
2018-10-07 22:00:19,683+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:21,286+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter ID]
2018-10-07 22:00:21,787+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:22,389+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter name]
2018-10-07 22:00:22,790+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:23,391+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add NFS storage domain]
2018-10-07 22:00:24,994+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:00:25,595+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add glusterfs storage domain]
2018-10-07 22:00:26,096+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:26,698+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add iSCSI storage domain]
2018-10-07 22:00:27,199+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:27,700+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add Fibre Channel storage domain]
2018-10-07 22:00:28,201+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:28,802+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get storage domain details]
2018-10-07 22:00:29,704+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:31,307+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Find the appliance OVF]
2018-10-07 22:00:32,109+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:33,812+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse OVF]
2018-10-07 22:00:34,614+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:35,115+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get required size]
2018-10-07 22:00:35,716+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:00:37,320+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Remove unsuitable storage domain]
2018-10-07 22:00:37,821+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:39,424+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check storage domain free space]
2018-10-07 22:00:39,925+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:00:40,526+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Activate storage domain]
2018-10-07 22:04:13,748+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:04:15,258+0100 INFO otopi.plugins.gr_he_ansiblesetup.core.target_vm target_vm._closeup:213 Creating Target VM
2018-10-07 22:04:16,064+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 22:04:16,965+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:17,467+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Register the engine FQDN as a host]
2018-10-07 22:04:17,968+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:04:18,469+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 22:04:18,870+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:19,472+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Obtain SSO token using username/password credentials]
2018-10-07 22:04:20,373+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:20,874+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get local VM IP]
2018-10-07 22:04:21,676+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:04:23,179+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch host facts]
2018-10-07 22:04:24,080+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:25,683+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Cluster ID]
2018-10-07 22:04:26,184+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:26,786+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Cluster facts]
2018-10-07 22:04:27,687+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:29,290+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter facts]
2018-10-07 22:04:30,192+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:31,795+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Cluster name]
2018-10-07 22:04:32,296+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:32,898+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter ID]
2018-10-07 22:04:33,399+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:34,000+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Datacenter name]
2018-10-07 22:04:34,501+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:35,103+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get Cluster CPU model]
2018-10-07 22:04:35,604+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:37,207+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Get storage domain details]
2018-10-07 22:04:38,209+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:04:39,812+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add HE disks]
2018-10-07 22:06:32,186+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Register disk details]
2018-10-07 22:06:32,788+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:06:34,491+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Add VM]
2018-10-07 22:06:36,595+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:06:38,198+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Register external local VM uuid]
2018-10-07 22:06:39,000+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:06:40,703+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 22:06:41,905+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [ovirt-engine.example.com]
2018-10-07 22:06:42,406+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Find configuration file for SCL PostgreSQL]
2018-10-07 22:06:43,509+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 22:06:45,012+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check SCL PostgreSQL value]
2018-10-07 22:06:46,114+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 22:06:47,717+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Update target VM details at DB level]
2018-10-07 22:06:50,523+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Insert Hosted Engine configuration disk uuid into Engine database]
2018-10-07 22:06:51,625+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 22:06:53,228+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Disable IPv6]
2018-10-07 22:06:54,330+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 22:06:54,931+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Reload sysctl]
2018-10-07 22:06:55,933+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [ovirt-engine.example.com]
2018-10-07 22:06:56,535+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 22:06:57,837+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:06:58,538+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Enable again the serial console device]
2018-10-07 22:06:59,741+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:00,442+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Trigger hosted engine OVF update]
2018-10-07 22:07:01,944+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:07:02,646+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait until OVF update finishes]
2018-10-07 22:07:03,748+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:04,449+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse OVF_STORE disk list]
2018-10-07 22:07:05,151+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:07,255+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check OVF_STORE volume status (ansible 2.5)]
2018-10-07 22:07:08,657+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Check OVF_STORE volume status (ansible 2.6)]
2018-10-07 22:07:11,362+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Prepare images]
2018-10-07 22:07:15,970+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Hosted Engine configuration disk path]
2018-10-07 22:07:16,672+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:17,373+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Hosted Engine virtio disk path]
2018-10-07 22:07:18,075+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:18,776+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch Hosted Engine virtio metadata path]
2018-10-07 22:07:19,377+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:07:24,286+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Shutdown local VM]
2018-10-07 22:07:25,288+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:07:26,090+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Wait for local VM shutdown]
2018-10-07 22:09:01,140+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:03,144+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Undefine local VM]
2018-10-07 22:09:04,046+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:04,747+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Detect spmId]
2018-10-07 22:09:05,749+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:07,853+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Parse spmId]
2018-10-07 22:09:08,555+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:10,558+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Detect ovirt-hosted-engine-ha version]
2018-10-07 22:09:11,460+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:12,162+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set ha_version]
2018-10-07 22:09:12,763+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:14,867+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create configuration templates]
2018-10-07 22:09:18,172+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create configuration archive]
2018-10-07 22:09:18,974+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:19,676+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create ovirt-hosted-engine-ha run directory]
2018-10-07 22:09:20,577+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:21,379+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy configuration files to the right location on host]
2018-10-07 22:09:23,483+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy configuration archive to storage]
2018-10-07 22:09:24,685+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:25,387+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Initialize metadata volume]
2018-10-07 22:09:36,903+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:37,604+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 22:09:38,306+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:39,107+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Find the local appliance image]
2018-10-07 22:09:40,009+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:42,113+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set local_vm_disk_path]
2018-10-07 22:09:42,714+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:09:43,415+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Generate DHCP network configuration for the engine VM]
2018-10-07 22:09:44,417+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:09:45,119+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Generate static network configuration for the engine VM]
2018-10-07 22:09:45,720+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 skipping: [localhost]
2018-10-07 22:09:46,422+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Inject network configuration with guestfish]
2018-10-07 22:10:01,444+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:02,146+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Extract /etc/hosts from the Hosted Engine VM]
2018-10-07 22:10:05,151+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:05,953+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Clean /etc/hosts for the Hosted Engine VM for Engine VM FQDN]
2018-10-07 22:10:06,754+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:07,455+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Clean /etc/hosts for the Hosted Engine VM for host address]
2018-10-07 22:10:08,357+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:09,059+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy /etc/hosts back to the Hosted Engine VM]
2018-10-07 22:10:12,364+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:13,065+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy local VM disk to shared storage]
2018-10-07 22:10:43,511+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:44,212+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Clean /etc/hosts on the host]
2018-10-07 22:10:44,814+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 {u'msg': u"The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n", u'_ansible_no_log': False}
2018-10-07 22:10:44,914+0100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:98 fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
2018-10-07 22:10:45,415+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:180 ansible-playbook rc: 2
2018-10-07 22:10:45,415+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [localhost] : ok: 70 changed: 24 unreachable: 0 skipped: 3 failed: 1
2018-10-07 22:10:45,415+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [ovirt-engine.example.com] : ok: 11 changed: 6 unreachable: 0 skipped: 0 failed: 0
2018-10-07 22:10:45,416+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:187 ansible-playbook stdout:
2018-10-07 22:10:45,416+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 to retry, use: --limit @/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.retry

2018-10-07 22:10:45,416+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:190 ansible-playbook stderr:
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/otopi/context.py", line 133, in _executeMethod
method['method']()
File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/target_vm.py", line 214, in _closeup
r = ah.run()
File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_setup/ansible_utils.py", line 194, in run
raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2018-10-07 22:10:45,418+0100 ERROR otopi.context context._executeMethod:152 Failed to execute stage 'Closing up': Failed executing ansible-playbook
2018-10-07 22:10:45,419+0100 INFO otopi.context context.runSequence:741 Stage: Clean up
2018-10-07 22:10:45,420+0100 INFO otopi.plugins.gr_he_ansiblesetup.core.misc misc._cleanup:246 Cleaning temporary resources
2018-10-07 22:10:46,225+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Gathering Facts]
2018-10-07 22:10:46,927+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:47,528+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fetch logs from the engine VM]
2018-10-07 22:10:47,929+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:48,430+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set destination directory path]
2018-10-07 22:10:48,931+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:49,332+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Create destination directory]
2018-10-07 22:10:50,134+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]
2018-10-07 22:10:50,535+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 22:10:51,036+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:51,537+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Find the local appliance image]
2018-10-07 22:10:52,239+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:53,641+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Set local_vm_disk_path]
2018-10-07 22:10:54,042+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:10:54,544+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Give the vm time to flush dirty buffers]
2018-10-07 22:11:04,959+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:11:05,460+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy engine logs]
2018-10-07 22:11:11,269+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [include_tasks]
2018-10-07 22:11:11,670+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]
2018-10-07 22:11:12,171+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Remove local vm dir]
2018-10-07 22:11:13,173+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]

2018-10-07 22:11:14,403+0100 INFO otopi.context context.runSequence:741 Stage: Termination
2018-10-07 22:11:14,404+0100 ERROR otopi.plugins.gr_he_common.core.misc misc._terminate:240 Hosted Engine deployment failed: please check the logs for the issue, fix accordingly or re-deploy from scratch.


For brevity, I've trimmed the debug lines from the above, except those around the error "The task includes an option with an undefined variable". Any ideas? I have assigned the engine a DHCP address and can resolve its name to a NATted IP: 192.168.124.51.
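As a sanity check on that DHCP/NAT setup, the forward resolution the installer depends on can be verified from the host. A minimal sketch (the FQDN is the placeholder value from my answer file; substitute your own):

```shell
# FQDN is the value of OVEHOSTED_NETWORK/fqdn from the answer file
# (a placeholder here; substitute your own).
FQDN=${FQDN:-ovirt-engine.example.com}

# getent consults the same NSS lookup chain (/etc/hosts, then DNS)
# that the deploy script sees, so it mirrors the installer's view.
getent hosts "$FQDN" || echo "no address found for $FQDN"
```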

I ran the same command again in case this was an anomaly:
hosted-engine --deploy --config-append=/var/lib/ovirt-hosted-engine-setup/answers/answers.conf

This time it didn't get that far; it only proceeded as far as:
[ INFO ] TASK [Activate storage domain]
[ ERROR ] Error: Fault reason is "Operation Failed". Fault detail is "[]". HTTP response code is 400.
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "Fault reason is \"Operation Failed\". Fault detail is \"[]\". HTTP response code is 400."}
[ ERROR ] Failed to execute stage 'Closing up': Failed executing ansible-playbook

The hosted-engine log this time shows:
2018-10-07 22:54:43,619+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Activate storage domain]
2018-10-07 22:54:45,022+0100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:98 Error: Fault reason is "Operation Failed". Fault detail is "[]". HTTP response code is 400.
2018-10-07 22:54:45,122+0100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:98 fatal: [localhost]: FAILED! => {"changed": false, "msg": "Fault reason is \"Operation Failed\". Fault detail is \"[]\". HTTP response code is 400."}

Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/otopi/context.py", line 133, in _executeMethod
method['method']()
File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/storage_domain.py", line 756, in _closeup
raise e
RuntimeError: Failed executing ansible-playbook


I've had other errors too, but my own investigations haven't solved them. Any ideas would be much appreciated.
Many thanks.
_______________________________________________
Users mailing list -- ***@ovirt.org
To unsubscribe send an email to users-***@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/***@ovirt.org/message/JDSQI57FSPQ3QNY
m***@brendanh.com
2018-10-07 23:39:24 UTC
Okay, the above failure at "TASK [Clean /etc/hosts on the host]" seems to have been because I hadn't emptied the shared NFS storage after the last attempt. After fresh-installing Node and emptying the storage (rm -rf *), it proceeded a few steps further, to:

[ INFO ] TASK [Clean /etc/hosts on the host]
[ INFO ] changed: [localhost]
[ INFO ] TASK [Add an entry in /etc/hosts for the target VM]
[ INFO ] skipping: [localhost]
[ INFO ] TASK [Start ovirt-ha-broker service on the host]
[ INFO ] changed: [localhost]
[ INFO ] TASK [Initialize lockspace volume]
[ INFO ] changed: [localhost]
[ INFO ] TASK [Start ovirt-ha-agent service on the host]
[ INFO ] changed: [localhost]
[ INFO ] TASK [Check engine VM health]

Then it hung. Eventually I killed it with Ctrl-C and ran the usual install again (hosted-engine --deploy --config-append=/var/lib/ovirt-hosted-engine-setup/answers/answers.conf) in case it was an anomaly. This time it only got as far as:
[ INFO ] TASK [Create local VM]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": true, "cmd": ["virt-install", "-n", "HostedEngineLocal", "--os-variant", "rhel7", "--virt-type", "kvm", "--memory", "16384", "--vcpus", "4", "--network", "network=default,mac=00:16:3e:0a:91:2c,model=virtio", "--disk", "/var/tmp/localvmhRtN_6/images/65f7f081-4d9e-43ae-926f-25807f075f1d/a0a00e73-d3ea-4b9b-bd26-06fe189931f2", "--import", "--disk", "path=/var/tmp/localvmhRtN_6/seed.iso,device=cdrom", "--noautoconsole", "--rng", "/dev/random", "--graphics", "vnc", "--video", "vga", "--sound", "none", "--controller", "usb,model=none", "--memballoon", "none", "--boot", "hd,menu=off", "--clock", "kvmclock_present=yes"], "delta": "0:00:00.512265", "end": "2018-10-08 00:33:59.632317", "msg": "non-zero return code", "rc": 1, "start": "2018-10-08 00:33:59.120052", "stderr": "ERROR The MAC address '00:16:3e:0a:91:2c' is in use by another virtual machine.\nDomain installation does not appear to have been successful.\nIf it was, you can restar
t your domain by running:\n virsh --connect qemu:///system start HostedEngineLocal\notherwise, please restart your installation.", "stderr_lines": ["ERROR The MAC address '00:16:3e:0a:91:2c' is in use by another virtual machine.", "Domain installation does not appear to have been successful.", "If it was, you can restart your domain by running:", " virsh --connect qemu:///system start HostedEngineLocal", "otherwise, please restart your installation."], "stdout": "\nStarting install...", "stdout_lines": ["", "Starting install..."]}
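Both failures in this message trace back to leftover state from the previous attempt: stale data on the NFS export, and a leftover HostedEngineLocal bootstrap VM still holding the MAC address. A between-runs cleanup might look like this sketch (STORAGE is a placeholder path, and the destructive virsh commands are commented out so nothing runs by accident):

```shell
# Inspect leftover local VMs from a previous bootstrap run.
if command -v virsh >/dev/null 2>&1; then
  virsh --connect qemu:///system list --all
  # If HostedEngineLocal is listed, removing it frees its MAC
  # (destructive, so left commented out; run by hand):
  # virsh --connect qemu:///system destroy HostedEngineLocal
  # virsh --connect qemu:///system undefine HostedEngineLocal
fi

# Empty the shared storage domain from the previous attempt.
# STORAGE is a placeholder; point it at the mounted NFS export.
# The ${STORAGE:?} expansion aborts if unset, so rm -rf cannot expand to /*.
STORAGE=${STORAGE:-}
if [ -n "$STORAGE" ]; then
  rm -rf "${STORAGE:?}"/*
fi
```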
m***@brendanh.com
2018-10-08 09:29:32 UTC
Fresh-installed again. After boot, I did not configure a bond this time. I attempted the install, chose the first NIC, and got as far as:
[ INFO ] TASK [Check the resolved address resolves on the selected interface]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "The resolved address doesn't resolve on the selected interface\n"}
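This task fails when the address the name resolves to is not among the addresses configured on the chosen NIC, so the comparison can be reproduced by hand. A rough sketch (the interface name and FQDN are placeholders, and my reading that the task checks the host's name rather than the engine's is an assumption):

```shell
# Placeholders; substitute the NIC you selected and the host's FQDN.
IFACE=${IFACE:-eth0}
FQDN=${FQDN:-host.example.com}

# Address the name resolves to (first IPv4 result):
resolved=$(getent ahostsv4 "$FQDN" | awk 'NR==1 {print $1}')

# IPv4 addresses actually configured on the interface:
configured=$(ip -o -4 addr show dev "$IFACE" 2>/dev/null \
             | awk '{print $4}' | cut -d/ -f1)

echo "resolved:   ${resolved:-<none>}"
echo "configured: ${configured:-<none>}"
```

If the resolved address does not appear in the configured list, the deploy check above will fail in the same way.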

So I ran it again, this time choosing the second of the two NICs, and got further. It failed at:
[ INFO ] TASK [Install ovirt-engine-appliance rpm]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 millisecond
s')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')\nTryi
ng other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\nhttp://resources.ovirt
.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')\nTrying other mirror.\n\n\nError downloading packages:\n ovirt-engine-appliance-4.2-20180903.1.el7.noarch: [Errno 256] No more mirrors to try.\n\nCannot upload enabled repos report, is this client registered?\n", "rc": 1, "results": ["Loaded plugins: enabled_repos_upload, fastestmirror, imgbased-persist,\n : package_upload, product-id, search-disabled-repos, subscription-\n : manager, vdsmupgrade\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\nLoading mirror speeds from cached hostfile\n * ovirt-4.2-epel: mirror.vorboss.net\nResolving Dependencies\n--> Running transaction check\n---> Package ovirt-engine-appliance.noarch 0:4.2-201
80903.1.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n ovirt-engine-appliance noarch 4.2-20180903.1.el7 ovirt-4.2 992 M\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 992 M\nInstalled size: 992 M\nDownloading packages:\nUploading Enabled Repositories Report\nLoaded plugins: fastestmirror, product-id, subscription-manager\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n"]}

Possibly a temporary internet problem. I tried again: same error. The internet is working fine from the host.
Simone Tiraboschi
2018-10-08 10:07:37 UTC
You can try manually installing ovirt-engine-appliance via yum.

On Mon, Oct 8, 2018 at 11:33 AM <***@brendanh.com> wrote:

> Fresh installed again. After boot, did not configure bond this time.
> Attempted install. Chose the first NIC, got as far as:
> [ INFO ] TASK [Check the resolved address resolves on the selected
> interface]
> [ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "The
> resolved address doesn't resolve on the selected interface\n"}
>
> So ran again, this time choosing the second of the two NICs, got further.
> Failed at:
> [ INFO ] TASK [Install ovirt-engine-appliance rpm]
> [... repeated download-timeout errors snipped ...]
>
> Possibly a temporary internet problem. Tried again. Same error.
> Internet is working fine from host.
m***@brendanh.com
2018-10-08 14:58:32 UTC
Thanks for the suggestion, Simone. I tried sudo yum install ovirt-engine-appliance; no joy:
"
=================================================================================================================================
Package Arch Version Repository Size
=================================================================================================================================
Installing:
ovirt-engine-appliance noarch 4.2-20180903.1.el7 ovirt-4.2 992 M

Transaction Summary
=================================================================================================================================
Install 1 Package

Total download size: 992 M
Installed size: 992 M
Is this ok [y/d/N]: y
Downloading packages:
Delta RPMs disabled because /usr/bin/applydeltarpm not installed.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED ] 1.8 kB/s | 124 MB 134:10:07 ETA
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30000 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30000 milliseconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')
Trying other mirror.
ovirt-engine-appliance-4.2-201 FAILED
http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: [Errno 12] Timeout on http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm: (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror.

Error downloading packages:
ovirt-engine-appliance-4.2-20180903.1.el7.noarch: [Errno 256] No more mirrors to try.
Uploading Enabled Repositories Report
Loaded plugins: fastestmirror, product-id, subscription-manager
This system is not registered with an entitlement server. You can use subscription-manager to register.
Cannot upload enabled repos report, is this client registered?
"


I tried a couple of times, with the same result.
Simone Tiraboschi
2018-10-08 15:02:30 UTC
On Mon, Oct 8, 2018 at 5:00 PM <***@brendanh.com> wrote:

> Thanks for the suggestion Simone. Tried sudo yum install
> ovirt-engine-appliance, no joy:
>

Can you please run a "yum clean all" and retry?


> [... quoted yum output snipped ...]
Yuval Turgeman
2018-10-08 15:10:29 UTC
Permalink
For debugging purposes, try downloading the RPM directly with curl or wget; if that
works, I would try increasing the timeout in yum.conf.
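For reference, those limits live in /etc/yum.conf. A sketch of the relevant settings (the values below are only suggestions, not tested here; they relax both the 30-second connection timeout and the "Less than 1000 bytes/sec" minimum-rate check that appear in the log):

```ini
[main]
# Defaults are timeout=30, minrate=1000, retries=10.
# Wait longer before declaring a connection stalled:
timeout=120
# Abort a transfer only if throughput drops below 100 bytes/sec:
minrate=100
# Cycle through the mirror list more times before giving up:
retries=20
```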

On Mon, Oct 8, 2018 at 6:05 PM Simone Tiraboschi <***@redhat.com>
wrote:

>
>
> On Mon, Oct 8, 2018 at 5:00 PM <***@brendanh.com> wrote:
>
>> Thanks for the suggestion Simone. Tried sudo yum install
>> ovirt-engine-appliance, no joy:
>>
>
> Can you please run a "yum clean all" and retry?
>
>
>> "
>>
>> =================================================================================================================================
>> Package Arch Version
>> Repository Size
>>
>> =================================================================================================================================
>> Installing:
>> ovirt-engine-appliance noarch
>> 4.2-20180903.1.el7 ovirt-4.2 992 M
>>
>> Transaction Summary
>>
>> =================================================================================================================================
>> Install 1 Package
>>
>> Total download size: 992 M
>> Installed size: 992 M
>> Is this ok [y/d/N]: y
>> Downloading packages:
>> Delta RPMs disabled because /usr/bin/applydeltarpm not installed.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30
>> seconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30001 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30001 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30001 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>> ] 1.8 kB/s | 124 MB 134:10:07 ETA
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30
>> seconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30001 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30000 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30000 milliseconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30
>> seconds')
>> Trying other mirror.
>> ovirt-engine-appliance-4.2-201 FAILED
>>
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> [Errno 12] Timeout on
>> http://resources.ovirt.org/pub/ovirt-4.2/rpm/el7/noarch/ovirt-engine-appliance-4.2-20180903.1.el7.noarch.rpm:
>> (28, 'Connection timed out after 30001 milliseconds')
>> Trying other mirror.
>>
>> Error downloading packages:
>> ovirt-engine-appliance-4.2-20180903.1.el7.noarch: [Errno 256] No more
>> mirrors to try.
>> Uploading Enabled Repositories Report
>> Loaded plugins: fastestmirror, product-id, subscription-manager
>> This system is not registered with an entitlement server. You can use
>> subscription-manager to register.
>> Cannot upload enabled repos report, is this client registered?
>> "
>>
>>
>> Tried a couple of times, same result.
>>
>
m***@brendanh.com
2018-10-08 23:19:02 UTC
Permalink
Okay, I went back to using a bond (instead of an individual NIC). The network problem above is fixed, and the install now proceeds as far as it ever has. It hangs for around 10 minutes at:

[ INFO ] TASK [Check engine VM health]
The hosted-engine-setup-ansible-create_target_vm log has:
2018-10-08 23:42:01,664+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Check engine VM health', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}

It then repeats the following line for around 10 minutes:
2018-10-08 23:42:01,866+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313cd338d0> kwargs

Eventually, the console outputs the following error:
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.167677", "end": "2018-10-08 23:53:11.112436", "rc": 0, "start": "2018-10-08 23:53:10.944759", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"c
onf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}"]}
[ INFO ] TASK [Check VM status at virt level]
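From the log below, the Ansible task is polling `hosted-engine --vm-status --json` and retrying (120 attempts, 5-second delay) until every host reports `engine-status.health == "good"`; here the agent reports `"health": "bad"` with reason `"failed liveliness check"`, meaning the VM is running but the engine inside it never answered its health check. A rough sketch of the condition the playbook evaluates (hypothetical helper, fed the stdout captured in the failure above):

```python
import json

def engine_healthy(vm_status_json: str) -> bool:
    """Mirror the playbook's until-condition: every host entry in the
    --vm-status JSON must report engine-status.health == "good"."""
    status = json.loads(vm_status_json)
    # Host entries are keyed by host-id; skip the global_maintenance flag.
    hosts = [v for k, v in status.items() if k != "global_maintenance"]
    return bool(hosts) and all(
        h["engine-status"]["health"] == "good" for h in hosts
    )

# Trimmed stdout from the failure above: health is "bad", so the
# task keeps retrying until its 120 attempts run out.
sample = (
    '{"1": {"engine-status": {"reason": "failed liveliness check", '
    '"health": "bad", "vm": "up", "detail": "Up"}, "score": 3400}, '
    '"global_maintenance": false}'
)
print(engine_healthy(sample))  # → False
```

Since the VM itself is up (virsh shows HostedEngine running, below), checking whether the engine web application answers from the host, e.g. with curl against the engine FQDN, would be a reasonable next step to see why the liveliness check fails.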

The hosted-engine-setup-ansible-create_target_vm log shows the following when this error occurs:

2018-10-08 23:53:11,812+0100 DEBUG var changed: host "localhost" var "ansible_failed_result" type "<type 'dict'>" value: "{
"_ansible_no_log": false,
"_ansible_parsed": true,
"attempts": 120,
"changed": true,
"cmd": [
"hosted-engine",
"--vm-status",
"--json"
],
"delta": "0:00:00.167677",
"end": "2018-10-08 23:53:11.112436",
"failed": true,
"invocation": {
"module_args": {
"_raw_params": "hosted-engine --vm-status --json",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": true
}
},
"rc": 0,
"start": "2018-10-08 23:53:10.944759",
"stderr": "",
"stderr_lines": [],
"stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}",
"stdout_lines": [
"{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}"
]
}"
2018-10-08 23:53:11,813+0100 DEBUG var changed: host "localhost" var "ansible_failed_task" type "<type 'dict'>" value: "{
"action": "command",
"any_errors_fatal": false,
"args": {
"_ansible_check_mode": false,
"_ansible_debug": false,
"_ansible_diff": false,
"_ansible_keep_remote_files": false,
"_ansible_module_name": "command",
"_ansible_no_log": false,
"_ansible_remote_tmp": "~/.ansible/tmp",
"_ansible_selinux_special_fs": [
"fuse",
"nfs",
"vboxsf",
"ramfs",
"9p"
],
"_ansible_shell_executable": "/bin/sh",
"_ansible_socket": null,
"_ansible_syslog_facility": "LOG_USER",
"_ansible_tmpdir": "/root/.ansible/tmp/ansible-tmp-1539039190.78-59276929025529/",
"_ansible_verbosity": 0,
"_ansible_version": "2.6.5",
"_raw_params": "hosted-engine --vm-status --json",
"warn": true
},
"async": 0,
"async_val": 0,
"become": null,
"become_flags": null,
"become_method": null,
"become_user": null,
"changed_when": [
true
],
"check_mode": null,
"connection": "local",
"debugger": null,
"delay": 5,
"delegate_facts": false,
"delegate_to": null,
"diff": null,
"environment": [
{
"LANG": "en_US.UTF-8",
"LC_ALL": "en_US.UTF-8",
"LC_MESSAGES": "en_US.UTF-8"
}
],
"failed_when": [],
"finalized": false,
"ignore_errors": null,
"loop": null,
"loop_control": null,
"loop_with": null,
"module_defaults": [],
"name": "Check engine VM health",
"no_log": null,
"notify": null,
"parent": {
"any_errors_fatal": null,
"become": null,
"become_flags": null,
"become_method": null,
"become_user": null,
"check_mode": null,
"connection": "local",
"debugger": null,
"delegate_facts": false,
"delegate_to": null,
"dep_chain": null,
"diff": null,
"environment": null,
"eor": false,
"ignore_errors": null,
"module_defaults": null,
"name": "Wait for the engine to come up on the target VM",
"no_log": null,
"port": null,
"remote_user": null,
"run_once": null,
"tags": [],
"vars": {},
"when": []
},
"parent_type": "Block",
"poll": 10,
"port": null,
"register": "health_result",
"remote_user": null,
"retries": 120,
"run_once": null,
"squashed": false,
"tags": [],
"until": [
"health_result.rc == 0 and 'health' in health_result.stdout and health_result.stdout|from_json|json_query('*.\"engine-status\".\"health\"')|first==\"good\""
],
"uuid": "525400d0-30bf-04f1-b63b-00000000006b",
"vars": {},
"when": []
}"
2018-10-08 23:53:11,813+0100 DEBUG var changed: host "localhost" var "health_result" type "<type 'dict'>" value: "{
"attempts": 120,
"changed": true,
"cmd": [
"hosted-engine",
"--vm-status",
"--json"
],
"delta": "0:00:00.167677",
"end": "2018-10-08 23:53:11.112436",
"failed": true,
"rc": 0,
"start": "2018-10-08 23:53:10.944759",
"stderr": "",
"stderr_lines": [],
"stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}",
"stdout_lines": [
"{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491 (Mon Oct 8 23:53:03 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8 23:53:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491, \"host-ts\": 49491}, \"global_maintenance\": false}"
]
}"
2018-10-08 23:53:11,813+0100 ERROR ansible failed {'status': 'FAILED', 'ansible_type': 'task', 'ansible_task': u'Check engine VM health', 'ansible_result': u'type: <type \'dict\'>\nstr: {\'_ansible_parsed\': True, \'stderr_lines\': [], u\'changed\': True, u\'end\': u\'2018-10-08 23:53:11.112436\', \'_ansible_no_log\': False, u\'stdout\': u\'{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\\\nmetadata_feature_version=1\\\\ntimestamp=49491 (Mon Oct 8 23:53:', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml'}
2018-10-08 23:53:11,813+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313cd338d0> kwargs ignore_errors:None
2018-10-08 23:53:12,517+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Check VM status at virt level', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:12,517+0100 DEBUG ansible on_any args TASK: Check VM status at virt level kwargs is_conditional:False
2018-10-08 23:53:13,433+0100 DEBUG var changed: host "localhost" var "vm_status_virsh" type "<type 'dict'>" value: "{
"changed": true,
"cmd": "virsh -r list | grep HostedEngine | grep running",
"delta": "0:00:00.053603",
"end": "2018-10-08 23:53:12.741182",
"failed": false,
"rc": 0,
"start": "2018-10-08 23:53:12.687579",
"stderr": "",
"stderr_lines": [],
"stdout": " 2 HostedEngine running",
"stdout_lines": [
" 2 HostedEngine running"
]
}"
2018-10-08 23:53:13,434+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'Check VM status at virt level', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:13,434+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313caedc10> kwargs
2018-10-08 23:53:14,159+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'debug', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:14,159+0100 DEBUG ansible on_any args TASK: debug kwargs is_conditional:False
2018-10-08 23:53:14,861+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:14,861+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313c161ad0> kwargs
2018-10-08 23:53:15,575+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Fail if engine VM is not running', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:15,576+0100 DEBUG ansible on_any args TASK: Fail if engine VM is not running kwargs is_conditional:False
2018-10-08 23:53:16,261+0100 INFO ansible skipped {'status': 'SKIPPED', 'ansible_task': u'Fail if engine VM is not running', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:16,261+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313ccec510> kwargs
2018-10-08 23:53:16,967+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Get target engine VM IPv4 address', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:16,968+0100 DEBUG ansible on_any args TASK: Get target engine VM IPv4 address kwargs is_conditional:False
2018-10-08 23:53:17,819+0100 DEBUG var changed: host "localhost" var "engine_vm_ipv4" type "<type 'dict'>" value: "{
"changed": true,
"cmd": "getent ahostsv4 ovirt-engine.example.com | cut -d' ' -f1 | uniq",
"delta": "0:00:00.004903",
"end": "2018-10-08 23:53:17.143283",
"failed": false,
"rc": 0,
"start": "2018-10-08 23:53:17.138380",
"stderr": "",
"stderr_lines": [],
"stdout": "10.0.0.109",
"stdout_lines": [
"10.0.0.109"
]
}"
2018-10-08 23:53:17,819+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'Get target engine VM IPv4 address', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:17,819+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313caedc10> kwargs
2018-10-08 23:53:18,531+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u"Get VDSM's target engine VM stats", 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:18,531+0100 DEBUG ansible on_any args TASK: Get VDSM's target engine VM stats kwargs is_conditional:False
2018-10-08 23:53:19,620+0100 DEBUG var changed: host "localhost" var "engine_vdsm_stats" type "<type 'dict'>" value: "{
"changed": true,
"cmd": [
"vdsm-client",
"VM",
"getStats",
"vmID=e1af6b26-9e48-251-940c-7bfadf920f3f"
],
"delta": "0:00:00.257926",
"end": "2018-10-08 23:53:18.961394",
"failed": false,
"rc": 0,
"start": "2018-10-08 23:53:18.703468",
"stderr": "",
"stderr_lines": [],
"stdout": "[\n {\n \"displayInfo\": [\n {\n \"tlsPort\": \"5901\", \n \"ipAddress\": \"10.0.0.171\", \n \"port\": \"5900\", \n \"type\": \"spice\"\n }, \n {\n \"tlsPort\": \"-1\", \n \"ipAddress\": \"10.0.0.171\", \n \"port\": \"5902\", \n \"type\": \"vnc\"\n }\n ], \n \"memUsage\": \"18\", \n \"acpiEnable\": \"true\", \n \"guestFQDN\": \"ovirt-engine.example.com\", \n \"vmId\": \"e1af6b26-9e48-251-940c-7bfadf920f3f\", \n \"session\": \"Unknown\", \n \"netIfaces\": [\n {\n \"name\": \"eth0\", \n \"inet6\": [], \n \"inet\": [], \n \"hw\": \"00:18:3d:5b:11:5c\"\n }\n ], \n \"timeOffset\": \"0\", \n \"memoryStats\": {\n \"swap_out\": 0, \n \
"majflt\": 0, \n \"minflt\": 198, \n \"mem_cached\": \"447696\", \n \"mem_free\": \"13259132\", \n \"mem_buffers\": \"2104\", \n \"swap_in\": 0, \n \"pageflt\": 198, \n \"mem_total\": \"16258492\", \n \"mem_unused\": \"13259132\"\n }, \n \"balloonInfo\": {\n \"balloon_max\": \"16777216\", \n \"balloon_cur\": \"16777216\", \n \"balloon_target\": \"16777216\", \n \"balloon_min\": \"1048576\"\n }, \n \"pauseCode\": \"NOERR\", \n \"disksUsage\": [\n {\n \"path\": \"/\", \n \"total\": \"6565134336\", \n \"used\": \"206525344\", \n \"fs\": \"xfs\"\n }, \n {\n \"path\": \"/boot\", \n \"total\": \"1063256064\", \n \"used\": \"169177088\", \n \"fs\": \"xfs\"\n }, \n
{\n \"path\": \"/tmp\", \n \"total\": \"2136997888\", \n \"used\": \"33935360\", \n \"fs\": \"xfs\"\n }, \n {\n \"path\": \"/home\", \n \"total\": \"1063256064\", \n \"used\": \"33796096\", \n \"fs\": \"xfs\"\n }, \n {\n \"path\": \"/var\", \n \"total\": \"21464350720\", \n \"used\": \"456179712\", \n \"fs\": \"xfs\"\n }, \n {\n \"path\": \"/var/log\", \n \"total\": \"10726932480\", \n \"used\": \"43315200\", \n \"fs\": \"xfs\"\n }, \n {\n \"path\": \"/var/log/audit\", \n \"total\": \"1063256064\", \n \"used\": \"3252160\", \n \"fs\": \"xfs\"\n }\n ], \n \"network\":
{\n \"vnet0\": {\n \"macAddr\": \"00:18:3d:5b:11:5c\", \n \"rxDropped\": \"0\", \n \"tx\": \"1710\", \n \"txDropped\": \"0\", \n \"rxErrors\": \"0\", \n \"rx\": \"349635\", \n \"txErrors\": \"0\", \n \"state\": \"unknown\", \n \"sampleTime\": 4344173.61, \n \"speed\": \"1000\", \n \"name\": \"vnet0\"\n }\n }, \n \"vmType\": \"kvm\", \n \"cpuUser\": \"3.00\", \n \"elapsedTime\": \"585\", \n \"vmJobs\": {}, \n \"cpuSys\": \"1.27\", \n \"appsList\": [\n \"kernel-3.10.0-862.11.6.el7\", \n \"cloud-init-0.7.9-24.el7.centos.1\", \n \"ovirt-guest-agent-common-1.0.14-1.el7\"\n ], \n \"guestOs\": \"3.10.0-862.11.6.el7.x86_64\", \n \"vmName\": \"HostedEngine\", \n \"vcpuCount\": \"4\", \n \"has
h\": \"3205592835746233126\", \n \"lastLogin\": 1539038623.192216, \n \"cpuUsage\": \"11670000000\", \n \"vcpuPeriod\": 100000, \n \"guestIPs\": \"\", \n \"guestTimezone\": {\n \"zone\": \"Europe/London\", \n \"offset\": 0\n }, \n \"vcpuQuota\": \"-1\", \n \"guestContainers\": [], \n \"kvmEnable\": \"true\", \n \"disks\": {\n \"vda\": {\n \"readLatency\": \"0\", \n \"writtenBytes\": \"44758528\", \n \"writeOps\": \"1063\", \n \"apparentsize\": \"53687091200\", \n \"readOps\": \"16151\", \n \"writeLatency\": \"631097\", \n \"imageID\": \"758f667c-6e9b-43eb-b09b-c983d78a6374\", \n \"readBytes\": \"475618816\", \n \"flushLatency\": \"30762\", \n \"readRate\": \"0.0\", \n \"truesize\": \"2670891008\", \n \"wr
iteRate\": \"3276.8\"\n }, \n \"hdc\": {\n \"readLatency\": \"0\", \n \"writtenBytes\": \"0\", \n \"writeOps\": \"0\", \n \"apparentsize\": \"0\", \n \"readOps\": \"4\", \n \"writeLatency\": \"0\", \n \"readBytes\": \"152\", \n \"flushLatency\": \"0\", \n \"readRate\": \"0.0\", \n \"truesize\": \"0\", \n \"writeRate\": \"0.0\"\n }\n }, \n \"monitorResponse\": \"0\", \n \"guestOsInfo\": {\n \"kernel\": \"3.10.0-862.11.6.el7.x86_64\", \n \"type\": \"linux\", \n \"version\": \"7.5.1804\", \n \"distribution\": \"CentOS Linux\", \n \"arch\": \"x86_64\", \n \"codename\": \"Core\"\n }, \n \"username\": \"None\", \n \"guestName\": \"ovirt-engine.example.com\", \n \"status\": \"Up\",
\n \"guestCPUCount\": 4, \n \"clientIp\": \"\", \n \"statusTime\": \"4344173610\"\n }\n]",
"stdout_lines": [
"[",
" {",
" \"displayInfo\": [",
" {",
" \"tlsPort\": \"5901\", ",
" \"ipAddress\": \"10.0.0.171\", ",
" \"port\": \"5900\", ",
" \"type\": \"spice\"",
" }, ",
" {",
" \"tlsPort\": \"-1\", ",
" \"ipAddress\": \"10.0.0.171\", ",
" \"port\": \"5902\", ",
" \"type\": \"vnc\"",
" }",
" ], ",
" \"memUsage\": \"18\", ",
" \"acpiEnable\": \"true\", ",
" \"guestFQDN\": \"ovirt-engine.example.com\", ",
" \"vmId\": \"e1af6b26-9e48-251-940c-7bfadf920f3f\", ",
" \"session\": \"Unknown\", ",
" \"netIfaces\": [",
" {",
" \"name\": \"eth0\", ",
" \"inet6\": [], ",
" \"inet\": [], ",
" \"hw\": \"00:18:3d:5b:11:5c\"",
" }",
" ], ",
" \"timeOffset\": \"0\", ",
" \"memoryStats\": {",
" \"swap_out\": 0, ",
" \"majflt\": 0, ",
" \"minflt\": 198, ",
" \"mem_cached\": \"447696\", ",
" \"mem_free\": \"13259132\", ",
" \"mem_buffers\": \"2104\", ",
" \"swap_in\": 0, ",
" \"pageflt\": 198, ",
" \"mem_total\": \"16258492\", ",
" \"mem_unused\": \"13259132\"",
" }, ",
" \"balloonInfo\": {",
" \"balloon_max\": \"16777216\", ",
" \"balloon_cur\": \"16777216\", ",
" \"balloon_target\": \"16777216\", ",
" \"balloon_min\": \"1048576\"",
" }, ",
" \"pauseCode\": \"NOERR\", ",
" \"disksUsage\": [",
" {",
" \"path\": \"/\", ",
" \"total\": \"6565134336\", ",
" \"used\": \"206525344\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/boot\", ",
" \"total\": \"1063256064\", ",
" \"used\": \"169177088\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/tmp\", ",
" \"total\": \"2136997888\", ",
" \"used\": \"33935360\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/home\", ",
" \"total\": \"1063256064\", ",
" \"used\": \"33796096\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/var\", ",
" \"total\": \"21464350720\", ",
" \"used\": \"456179712\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/var/log\", ",
" \"total\": \"10726932480\", ",
" \"used\": \"43315200\", ",
" \"fs\": \"xfs\"",
" }, ",
" {",
" \"path\": \"/var/log/audit\", ",
" \"total\": \"1063256064\", ",
" \"used\": \"3252160\", ",
" \"fs\": \"xfs\"",
" }",
" ], ",
" \"network\": {",
" \"vnet0\": {",
" \"macAddr\": \"00:18:3d:5b:11:5c\", ",
" \"rxDropped\": \"0\", ",
" \"tx\": \"1710\", ",
" \"txDropped\": \"0\", ",
" \"rxErrors\": \"0\", ",
" \"rx\": \"349635\", ",
" \"txErrors\": \"0\", ",
" \"state\": \"unknown\", ",
" \"sampleTime\": 4344173.61, ",
" \"speed\": \"1000\", ",
" \"name\": \"vnet0\"",
" }",
" }, ",
" \"vmType\": \"kvm\", ",
" \"cpuUser\": \"3.00\", ",
" \"elapsedTime\": \"585\", ",
" \"vmJobs\": {}, ",
" \"cpuSys\": \"1.27\", ",
" \"appsList\": [",
" \"kernel-3.10.0-862.11.6.el7\", ",
" \"cloud-init-0.7.9-24.el7.centos.1\", ",
" \"ovirt-guest-agent-common-1.0.14-1.el7\"",
" ], ",
" \"guestOs\": \"3.10.0-862.11.6.el7.x86_64\", ",
" \"vmName\": \"HostedEngine\", ",
" \"vcpuCount\": \"4\", ",
" \"hash\": \"3205592835746233126\", ",
" \"lastLogin\": 1539038623.192216, ",
" \"cpuUsage\": \"11670000000\", ",
" \"vcpuPeriod\": 100000, ",
" \"guestIPs\": \"\", ",
" \"guestTimezone\": {",
" \"zone\": \"Europe/London\", ",
" \"offset\": 0",
" }, ",
" \"vcpuQuota\": \"-1\", ",
" \"guestContainers\": [], ",
" \"kvmEnable\": \"true\", ",
" \"disks\": {",
" \"vda\": {",
" \"readLatency\": \"0\", ",
" \"writtenBytes\": \"44758528\", ",
" \"writeOps\": \"1063\", ",
" \"apparentsize\": \"53687091200\", ",
" \"readOps\": \"16151\", ",
" \"writeLatency\": \"631097\", ",
" \"imageID\": \"758f667c-6e9b-43eb-b09b-c983d78a6374\", ",
" \"readBytes\": \"475618816\", ",
" \"flushLatency\": \"30762\", ",
" \"readRate\": \"0.0\", ",
" \"truesize\": \"2670891008\", ",
" \"writeRate\": \"3276.8\"",
" }, ",
" \"hdc\": {",
" \"readLatency\": \"0\", ",
" \"writtenBytes\": \"0\", ",
" \"writeOps\": \"0\", ",
" \"apparentsize\": \"0\", ",
" \"readOps\": \"4\", ",
" \"writeLatency\": \"0\", ",
" \"readBytes\": \"152\", ",
" \"flushLatency\": \"0\", ",
" \"readRate\": \"0.0\", ",
" \"truesize\": \"0\", ",
" \"writeRate\": \"0.0\"",
" }",
" }, ",
" \"monitorResponse\": \"0\", ",
" \"guestOsInfo\": {",
" \"kernel\": \"3.10.0-862.11.6.el7.x86_64\", ",
" \"type\": \"linux\", ",
" \"version\": \"7.5.1804\", ",
" \"distribution\": \"CentOS Linux\", ",
" \"arch\": \"x86_64\", ",
" \"codename\": \"Core\"",
" }, ",
" \"username\": \"None\", ",
" \"guestName\": \"ovirt-engine.example.com\", ",
" \"status\": \"Up\", ",
" \"guestCPUCount\": 4, ",
" \"clientIp\": \"\", ",
" \"statusTime\": \"4344173610\"",
" }",
"]"
]
}"
2018-10-08 23:53:19,620+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u"Get VDSM's target engine VM stats", 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:19,620+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313cc17e10> kwargs
2018-10-08 23:53:20,321+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Convert stats to JSON format', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:20,322+0100 DEBUG ansible on_any args TASK: Convert stats to JSON format kwargs is_conditional:False
2018-10-08 23:53:21,008+0100 DEBUG var changed: host "localhost" var "json_stats" type "<type 'list'>" value: "[
{
"acpiEnable": "true",
"appsList": [
"kernel-3.10.0-862.11.6.el7",
"cloud-init-0.7.9-24.el7.centos.1",
"ovirt-guest-agent-common-1.0.14-1.el7"
],
"balloonInfo": {
"balloon_cur": "16777216",
"balloon_max": "16777216",
"balloon_min": "1048576",
"balloon_target": "16777216"
},
"clientIp": "",
"cpuSys": "1.27",
"cpuUsage": "11670000000",
"cpuUser": "3.00",
"disks": {
"hdc": {
"apparentsize": "0",
"flushLatency": "0",
"readBytes": "152",
"readLatency": "0",
"readOps": "4",
"readRate": "0.0",
"truesize": "0",
"writeLatency": "0",
"writeOps": "0",
"writeRate": "0.0",
"writtenBytes": "0"
},
"vda": {
"apparentsize": "53687091200",
"flushLatency": "30762",
"imageID": "758f667c-6e9b-43eb-b09b-c983d78a6374",
"readBytes": "475618816",
"readLatency": "0",
"readOps": "16151",
"readRate": "0.0",
"truesize": "2670891008",
"writeLatency": "631097",
"writeOps": "1063",
"writeRate": "3276.8",
"writtenBytes": "44758528"
}
},
"disksUsage": [
{
"fs": "xfs",
"path": "/",
"total": "6565134336",
"used": "206525344"
},
{
"fs": "xfs",
"path": "/boot",
"total": "1063256064",
"used": "169177088"
},
{
"fs": "xfs",
"path": "/tmp",
"total": "2136997888",
"used": "33935360"
},
{
"fs": "xfs",
"path": "/home",
"total": "1063256064",
"used": "33796096"
},
{
"fs": "xfs",
"path": "/var",
"total": "21464350720",
"used": "456179712"
},
{
"fs": "xfs",
"path": "/var/log",
"total": "10726932480",
"used": "43315200"
},
{
"fs": "xfs",
"path": "/var/log/audit",
"total": "1063256064",
"used": "3252160"
}
],
"displayInfo": [
{
"ipAddress": "10.0.0.171",
"port": "5900",
"tlsPort": "5901",
"type": "spice"
},
{
"ipAddress": "10.0.0.171",
"port": "5902",
"tlsPort": "-1",
"type": "vnc"
}
],
"elapsedTime": "585",
"guestCPUCount": 4,
"guestContainers": [],
"guestFQDN": "ovirt-engine.example.com",
"guestIPs": "",
"guestName": "ovirt-engine.example.com",
"guestOs": "3.10.0-862.11.6.el7.x86_64",
"guestOsInfo": {
"arch": "x86_64",
"codename": "Core",
"distribution": "CentOS Linux",
"kernel": "3.10.0-862.11.6.el7.x86_64",
"type": "linux",
"version": "7.5.1804"
},
"guestTimezone": {
"offset": 0,
"zone": "Europe/London"
},
"hash": "3205592835746233126",
"kvmEnable": "true",
"lastLogin": 1539038623.192216,
"memUsage": "18",
"memoryStats": {
"majflt": 0,
"mem_buffers": "2104",
"mem_cached": "447696",
"mem_free": "13259132",
"mem_total": "16258492",
"mem_unused": "13259132",
"minflt": 198,
"pageflt": 198,
"swap_in": 0,
"swap_out": 0
},
"monitorResponse": "0",
"netIfaces": [
{
"hw": "00:18:3d:5b:11:5c",
"inet": [],
"inet6": [],
"name": "eth0"
}
],
"network": {
"vnet0": {
"macAddr": "00:18:3d:5b:11:5c",
"name": "vnet0",
"rx": "349635",
"rxDropped": "0",
"rxErrors": "0",
"sampleTime": 4344173.61,
"speed": "1000",
"state": "unknown",
"tx": "1710",
"txDropped": "0",
"txErrors": "0"
}
},
"pauseCode": "NOERR",
"session": "Unknown",
"status": "Up",
"statusTime": "4344173610",
"timeOffset": "0",
"username": "None",
"vcpuCount": "4",
"vcpuPeriod": 100000,
"vcpuQuota": "-1",
"vmId": "e1af6b26-9e48-251-940c-7bfadf920f3f",
"vmJobs": {},
"vmName": "HostedEngine",
"vmType": "kvm"
}
]"
2018-10-08 23:53:21,009+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'Convert stats to JSON format', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:21,009+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313db268d0> kwargs
2018-10-08 23:53:21,720+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Get target engine VM IPv4 address from VDSM stats', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:21,720+0100 DEBUG ansible on_any args TASK: Get target engine VM IPv4 address from VDSM stats kwargs is_conditional:False
2018-10-08 23:53:22,397+0100 DEBUG var changed: host "localhost" var "engine_vm_ip_vdsm" type "<class 'ansible.utils.unsafe_proxy.AnsibleUnsafeText'>" value: """"
2018-10-08 23:53:22,398+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'Get target engine VM IPv4 address from VDSM stats', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:22,398+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313caed510> kwargs
2018-10-08 23:53:23,124+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'debug', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:23,124+0100 DEBUG ansible on_any args TASK: debug kwargs is_conditional:False
2018-10-08 23:53:23,807+0100 INFO ansible ok {'status': 'OK', 'ansible_task': u'', 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:23,807+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313ca343d0> kwargs
2018-10-08 23:53:24,511+0100 INFO ansible task start {'status': 'OK', 'ansible_task': u'Fail if the Engine has no IP address', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'task'}
2018-10-08 23:53:24,511+0100 DEBUG ansible on_any args TASK: Fail if the Engine has no IP address kwargs is_conditional:False
2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var "ansible_play_hosts" type "<type 'list'>" value: "[]"
2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var "play_hosts" type "<type 'list'>" value: "[]"
2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var "ansible_play_batch" type "<type 'list'>" value: "[]"
2018-10-08 23:53:25,153+0100 ERROR ansible failed {'status': 'FAILED', 'ansible_type': 'task', 'ansible_task': u'Fail if the Engine has no IP address', 'ansible_result': u"type: <type 'dict'>\nstr: {'msg': u'Engine VM has no IP address. Please check your network configuration', 'changed': False, '_ansible_no_log': False}", 'ansible_host': u'localhost', 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml'}
2018-10-08 23:53:25,153+0100 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f313caed6d0> kwargs ignore_errors:None
2018-10-08 23:53:25,154+0100 INFO ansible stats {'status': 'FAILED', 'ansible_playbook_duration': 1064.367312, 'ansible_result': u"type: <type 'dict'>\nstr: {u'ovirt-engine.example.com': {'unreachable': 0, 'skipped': 0, 'ok': 11, 'changed': 6, 'failures': 0}, u'localhost': {'unreachable': 0, 'skipped': 5, 'ok': 82, 'changed': 30, 'failures': 2}}", 'ansible_playbook': u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml', 'ansible_type': 'finish'}
2018-10-08 23:53:25,154+0100 DEBUG ansible on_any args <ansible.executor.stats.AggregateStats object at 0x7f313ef6c290> kwargs


I ran it again (clearing the shared NFS storage before each attempt) and got the following output on the console:
[ INFO ] TASK [Undefine leftover engine VM]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": true, "cmd": ["virsh", "undefine", "--managed-save", "HostedEngine"], "delta": "0:00:00.050407", "end": "2018-10-09 00:16:58.085198", "msg": "non-zero return code", "rc": 1, "start": "2018-10-09 00:16:58.034791", "stderr": "error: Failed to undefine domain HostedEngine\nerror: Requested operation is not valid: cannot undefine transient domain", "stderr_lines": ["error: Failed to undefine domain HostedEngine", "error: Requested operation is not valid: cannot undefine transient domain"], "stdout": "", "stdout_lines": []}
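The "cannot undefine transient domain" error above is standard libvirt behaviour: a transient domain has no persistent definition on disk, so it cannot be undefined, only destroyed (which removes it entirely). A minimal sketch of cleanup logic that handles both cases — this is a hypothetical helper of my own, not part of ovirt-hosted-engine-setup, shown only to illustrate the distinction:

```python
import subprocess

def cleanup_leftover_vm(name="HostedEngine", run=subprocess.run):
    """Try 'virsh undefine'; fall back to 'virsh destroy' for transient domains.

    Hypothetical helper (not oVirt code). A transient libvirt domain exists
    only while running and has no persistent XML, so 'undefine' fails with
    "cannot undefine transient domain"; 'destroy' both stops and removes it.
    """
    res = run(["virsh", "undefine", "--managed-save", name],
              capture_output=True, text=True)
    if res.returncode == 0:
        return "undefined"          # persistent leftover definition removed
    if "transient domain" in res.stderr:
        run(["virsh", "destroy", name], capture_output=True, text=True)
        return "destroyed"          # transient domain: destroying removes it
    raise RuntimeError(res.stderr.strip())
```

In other words, for the failure shown in the log, "virsh destroy HostedEngine" (rather than undefine) would have been the applicable cleanup.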
_______________________________________________
Users mailing list -- ***@ovirt.org
To unsubscribe send an email to users-***@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/***@ovir
m***@brendanh.com
2018-10-08 23:22:53 UTC
Permalink
I should qualify my earlier statement, "I went back to using a bond (instead of an individual NIC). The above
network problem is fixed and the deploy now proceeds as far as it ever has."

After running yum install ovirt-engine-appliance, I ran the usual hosted-engine --deploy command again, per the advice here:
https://ovirt.org/documentation/how-to/hosted-engine/
I trust this is correct.
Simone Tiraboschi
2018-10-09 07:47:59 UTC
Permalink
On Tue, Oct 9, 2018 at 1:21 AM <***@brendanh.com> wrote:

> Okay, I went back to using a bond (instead of an individual NIC). Above
> network problem is fixed and now proceeds as far as ever. Hangs for around
> 10 minutes at:
>
> [ INFO ] TASK [Check engine VM health]
> The hosted-engine-setup-ansible-create_target_vm log has:
> 2018-10-08 23:42:01,664+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Check engine VM health', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
>
> Then repeats the following line for around 10 minutes:
> 2018-10-08 23:42:01,866+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313cd338d0> kwargs
>
> Before eventually, the console outputs the following error:
> [ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed":
> true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta":
> "0:00:00.167677", "end": "2018-10-08 23:53:11.112436", "rc": 0, "start":
> "2018-10-08 23:53:10.944759", "stderr": "", "stderr_lines": [], "stdout":
> "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}", "stdout_lines":
> ["{\"1\": {\"c
> onf_on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}"]}
> [ INFO ] TASK [Check VM status at virt level]
>
> The hosted-engine-setup-ansible-create_target_vm log shows the following
> when this error occurs:
>
> 2018-10-08 23:53:11,812+0100 DEBUG var changed: host "localhost" var
> "ansible_failed_result" type "<type 'dict'>" value: "{
> "_ansible_no_log": false,
> "_ansible_parsed": true,
> "attempts": 120,
> "changed": true,
> "cmd": [
> "hosted-engine",
> "--vm-status",
> "--json"
> ],
> "delta": "0:00:00.167677",
> "end": "2018-10-08 23:53:11.112436",
> "failed": true,
> "invocation": {
> "module_args": {
> "_raw_params": "hosted-engine --vm-status --json",
> "_uses_shell": false,
> "argv": null,
> "chdir": null,
> "creates": null,
> "executable": null,
> "removes": null,
> "stdin": null,
> "warn": true
> }
> },
> "rc": 0,
> "start": "2018-10-08 23:53:10.944759",
> "stderr": "",
> "stderr_lines": [],
> "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\":
> true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}",
>

This is usually a name resolution issue:
"vm": "up" - this is checked at virt level
"reason": "failed liveliness check", "health": "bad" - this is checked
from the host over http

I'd suggest double-checking that the host can correctly resolve the name of
the engine VM, and that the engine VM actually got the address its FQDN
resolves to.
Do you have a properly working DHCP server with a reservation for the engine
VM? Did you set a static IP on the engine VM?
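Both checks can be scripted. A minimal sketch (the function names are mine, not part of any oVirt tooling) mimicking what the setup later does with "getent ahostsv4" and comparing the result against the addresses the guest actually reports:

```python
import socket

def resolve_ipv4(fqdn):
    """Return the IPv4 addresses fqdn resolves to (what 'getent ahostsv4' shows)."""
    infos = socket.getaddrinfo(fqdn, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

def engine_dns_matches(fqdn, guest_ips):
    """True if at least one address held by the guest matches the engine FQDN.

    guest_ips would come from the VDSM stats ('guestIPs' / 'netIfaces' fields);
    in the log above that list is empty, which is why the liveliness check
    over http can never succeed even though the VM is "up" at virt level.
    """
    return bool(set(resolve_ipv4(fqdn)) & set(guest_ips))
```

Note in the log that getent resolves ovirt-engine.example.com to 10.0.0.109 while VDSM shows eth0 with empty "inet"/"guestIPs", so the intersection here would be empty.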


> "stdout_lines": [
> "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true,
> \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}"
> ]
> }"
> 2018-10-08 23:53:11,813+0100 DEBUG var changed: host "localhost" var
> "ansible_failed_task" type "<type 'dict'>" value: "{
> "action": "command",
> "any_errors_fatal": false,
> "args": {
> "_ansible_check_mode": false,
> "_ansible_debug": false,
> "_ansible_diff": false,
> "_ansible_keep_remote_files": false,
> "_ansible_module_name": "command",
> "_ansible_no_log": false,
> "_ansible_remote_tmp": "~/.ansible/tmp",
> "_ansible_selinux_special_fs": [
> "fuse",
> "nfs",
> "vboxsf",
> "ramfs",
> "9p"
> ],
> "_ansible_shell_executable": "/bin/sh",
> "_ansible_socket": null,
> "_ansible_syslog_facility": "LOG_USER",
> "_ansible_tmpdir":
> "/root/.ansible/tmp/ansible-tmp-1539039190.78-59276929025529/",
> "_ansible_verbosity": 0,
> "_ansible_version": "2.6.5",
> "_raw_params": "hosted-engine --vm-status --json",
> "warn": true
> },
> "async": 0,
> "async_val": 0,
> "become": null,
> "become_flags": null,
> "become_method": null,
> "become_user": null,
> "changed_when": [
> true
> ],
> "check_mode": null,
> "connection": "local",
> "debugger": null,
> "delay": 5,
> "delegate_facts": false,
> "delegate_to": null,
> "diff": null,
> "environment": [
> {
> "LANG": "en_US.UTF-8",
> "LC_ALL": "en_US.UTF-8",
> "LC_MESSAGES": "en_US.UTF-8"
> }
> ],
> "failed_when": [],
> "finalized": false,
> "ignore_errors": null,
> "loop": null,
> "loop_control": null,
> "loop_with": null,
> "module_defaults": [],
> "name": "Check engine VM health",
> "no_log": null,
> "notify": null,
> "parent": {
> "any_errors_fatal": null,
> "become": null,
> "become_flags": null,
> "become_method": null,
> "become_user": null,
> "check_mode": null,
> "connection": "local",
> "debugger": null,
> "delegate_facts": false,
> "delegate_to": null,
> "dep_chain": null,
> "diff": null,
> "environment": null,
> "eor": false,
> "ignore_errors": null,
> "module_defaults": null,
> "name": "Wait for the engine to come up on the target VM",
> "no_log": null,
> "port": null,
> "remote_user": null,
> "run_once": null,
> "tags": [],
> "vars": {},
> "when": []
> },
> "parent_type": "Block",
> "poll": 10,
> "port": null,
> "register": "health_result",
> "remote_user": null,
> "retries": 120,
> "run_once": null,
> "squashed": false,
> "tags": [],
> "until": [
> "health_result.rc == 0 and 'health' in health_result.stdout and
> health_result.stdout|from_json|json_query('*.\"engine-status\".\"health\"')|first==\"good\""
> ],
> "uuid": "525400d0-30bf-04f1-b63b-00000000006b",
> "vars": {},
> "when": []
> }"
> 2018-10-08 23:53:11,813+0100 DEBUG var changed: host "localhost" var
> "health_result" type "<type 'dict'>" value: "{
> "attempts": 120,
> "changed": true,
> "cmd": [
> "hosted-engine",
> "--vm-status",
> "--json"
> ],
> "delta": "0:00:00.167677",
> "end": "2018-10-08 23:53:11.112436",
> "failed": true,
> "rc": 0,
> "start": "2018-10-08 23:53:10.944759",
> "stderr": "",
> "stderr_lines": [],
> "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\":
> true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}",
> "stdout_lines": [
> "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true,
> \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=49491
> (Mon Oct 8 23:53:03
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=49491 (Mon Oct 8
> 23:53:03
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"75452be7\", \"local_conf_timestamp\": 49491,
> \"host-ts\": 49491}, \"global_maintenance\": false}"
> ]
> }"
> 2018-10-08 23:53:11,813+0100 ERROR ansible failed {'status': 'FAILED',
> 'ansible_type': 'task', 'ansible_task': u'Check engine VM health',
> 'ansible_result': u'type: <type \'dict\'>\nstr: {\'_ansible_parsed\': True,
> \'stderr_lines\': [], u\'changed\': True, u\'end\': u\'2018-10-08
> 23:53:11.112436\', \'_ansible_no_log\': False, u\'stdout\': u\'{"1":
> {"conf_on_shared_storage": true, "live-data": true, "extra":
> "metadata_parse_version=1\\\\nmetadata_feature_version=1\\\\ntimestamp=49491
> (Mon Oct 8 23:53:', 'ansible_host': u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml'}
> 2018-10-08 23:53:11,813+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313cd338d0> kwargs
> ignore_errors:None
> 2018-10-08 23:53:12,517+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Check VM status at virt level', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:12,517+0100 DEBUG ansible on_any args TASK: Check VM
> status at virt level kwargs is_conditional:False
> 2018-10-08 23:53:13,433+0100 DEBUG var changed: host "localhost" var
> "vm_status_virsh" type "<type 'dict'>" value: "{
> "changed": true,
> "cmd": "virsh -r list | grep HostedEngine | grep running",
> "delta": "0:00:00.053603",
> "end": "2018-10-08 23:53:12.741182",
> "failed": false,
> "rc": 0,
> "start": "2018-10-08 23:53:12.687579",
> "stderr": "",
> "stderr_lines": [],
> "stdout": " 2 HostedEngine running",
> "stdout_lines": [
> " 2 HostedEngine running"
> ]
> }"
> 2018-10-08 23:53:13,434+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'Check VM status at virt level', 'ansible_host':
> u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:13,434+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313caedc10> kwargs
> 2018-10-08 23:53:14,159+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'debug', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:14,159+0100 DEBUG ansible on_any args TASK: debug kwargs
> is_conditional:False
> 2018-10-08 23:53:14,861+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'', 'ansible_host': u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:14,861+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313c161ad0> kwargs
> 2018-10-08 23:53:15,575+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Fail if engine VM is not running', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:15,576+0100 DEBUG ansible on_any args TASK: Fail if
> engine VM is not running kwargs is_conditional:False
> 2018-10-08 23:53:16,261+0100 INFO ansible skipped {'status': 'SKIPPED',
> 'ansible_task': u'Fail if engine VM is not running', 'ansible_host':
> u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:16,261+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313ccec510> kwargs
> 2018-10-08 23:53:16,967+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Get target engine VM IPv4 address', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:16,968+0100 DEBUG ansible on_any args TASK: Get target
> engine VM IPv4 address kwargs is_conditional:False
> 2018-10-08 23:53:17,819+0100 DEBUG var changed: host "localhost" var
> "engine_vm_ipv4" type "<type 'dict'>" value: "{
> "changed": true,
> "cmd": "getent ahostsv4 ovirt-engine.example.com | cut -d' ' -f1 |
> uniq",
> "delta": "0:00:00.004903",
> "end": "2018-10-08 23:53:17.143283",
> "failed": false,
> "rc": 0,
> "start": "2018-10-08 23:53:17.138380",
> "stderr": "",
> "stderr_lines": [],
> "stdout": "10.0.0.109",
> "stdout_lines": [
> "10.0.0.109"
> ]
> }"
> 2018-10-08 23:53:17,819+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'Get target engine VM IPv4 address', 'ansible_host':
> u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:17,819+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313caedc10> kwargs
> 2018-10-08 23:53:18,531+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u"Get VDSM's target engine VM stats", 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:18,531+0100 DEBUG ansible on_any args TASK: Get VDSM's
> target engine VM stats kwargs is_conditional:False
> 2018-10-08 23:53:19,620+0100 DEBUG var changed: host "localhost" var
> "engine_vdsm_stats" type "<type 'dict'>" value: "{
> "changed": true,
> "cmd": [
> "vdsm-client",
> "VM",
> "getStats",
> "vmID=e1af6b26-9e48-251-940c-7bfadf920f3f"
> ],
> "delta": "0:00:00.257926",
> "end": "2018-10-08 23:53:18.961394",
> "failed": false,
> "rc": 0,
> "start": "2018-10-08 23:53:18.703468",
> "stderr": "",
> "stderr_lines": [],
> "stdout": "[\n {\n \"displayInfo\": [\n {\n
> \"tlsPort\": \"5901\", \n \"ipAddress\":
> \"10.0.0.171\", \n \"port\": \"5900\", \n
> \"type\": \"spice\"\n }, \n {\n
> \"tlsPort\": \"-1\", \n \"ipAddress\": \"10.0.0.171\", \n
> \"port\": \"5902\", \n \"type\": \"vnc\"\n
> }\n ], \n \"memUsage\": \"18\", \n
> \"acpiEnable\": \"true\", \n \"guestFQDN\": \"
> ovirt-engine.example.com\", \n \"vmId\":
> \"e1af6b26-9e48-251-940c-7bfadf920f3f\", \n \"session\":
> \"Unknown\", \n \"netIfaces\": [\n {\n
> \"name\": \"eth0\", \n \"inet6\": [], \n
> \"inet\": [], \n \"hw\": \"00:18:3d:5b:11:5c\"\n
> }\n ], \n \"timeOffset\": \"0\", \n \"memoryStats\":
> {\n \"swap_out\": 0, \n \
> "majflt\": 0, \n \"minflt\": 198, \n
> \"mem_cached\": \"447696\", \n \"mem_free\": \"13259132\", \n
> \"mem_buffers\": \"2104\", \n \"swap_in\": 0, \n
> \"pageflt\": 198, \n \"mem_total\": \"16258492\", \n
> \"mem_unused\": \"13259132\"\n }, \n \"balloonInfo\":
> {\n \"balloon_max\": \"16777216\", \n
> \"balloon_cur\": \"16777216\", \n \"balloon_target\":
> \"16777216\", \n \"balloon_min\": \"1048576\"\n }, \n
> \"pauseCode\": \"NOERR\", \n \"disksUsage\": [\n {\n
> \"path\": \"/\", \n \"total\": \"6565134336\",
> \n \"used\": \"206525344\", \n \"fs\":
> \"xfs\"\n }, \n {\n \"path\":
> \"/boot\", \n \"total\": \"1063256064\", \n
> \"used\": \"169177088\", \n \"fs\": \"xfs\"\n },
> \n
> {\n \"path\": \"/tmp\", \n
> \"total\": \"2136997888\", \n \"used\": \"33935360\", \n
> \"fs\": \"xfs\"\n }, \n {\n
> \"path\": \"/home\", \n \"total\": \"1063256064\", \n
> \"used\": \"33796096\", \n \"fs\": \"xfs\"\n
> }, \n {\n \"path\": \"/var\", \n
> \"total\": \"21464350720\", \n \"used\": \"456179712\",
> \n \"fs\": \"xfs\"\n }, \n {\n
> \"path\": \"/var/log\", \n \"total\":
> \"10726932480\", \n \"used\": \"43315200\", \n
> \"fs\": \"xfs\"\n }, \n {\n
> \"path\": \"/var/log/audit\", \n \"total\": \"1063256064\",
> \n \"used\": \"3252160\", \n \"fs\":
> \"xfs\"\n }\n ], \n \"network\":
> {\n \"vnet0\": {\n \"macAddr\":
> \"00:18:3d:5b:11:5c\", \n \"rxDropped\": \"0\", \n
> \"tx\": \"1710\", \n \"txDropped\": \"0\", \n
> \"rxErrors\": \"0\", \n \"rx\": \"349635\", \n
> \"txErrors\": \"0\", \n \"state\": \"unknown\", \n
> \"sampleTime\": 4344173.61, \n \"speed\":
> \"1000\", \n \"name\": \"vnet0\"\n }\n },
> \n \"vmType\": \"kvm\", \n \"cpuUser\": \"3.00\", \n
> \"elapsedTime\": \"585\", \n \"vmJobs\": {}, \n \"cpuSys\":
> \"1.27\", \n \"appsList\": [\n
> \"kernel-3.10.0-862.11.6.el7\", \n
> \"cloud-init-0.7.9-24.el7.centos.1\", \n
> \"ovirt-guest-agent-common-1.0.14-1.el7\"\n ], \n
> \"guestOs\": \"3.10.0-862.11.6.el7.x86_64\", \n \"vmName\":
> \"HostedEngine\", \n \"vcpuCount\": \"4\", \n \"has
> h\": \"3205592835746233126\", \n \"lastLogin\": 1539038623.192216,
> \n \"cpuUsage\": \"11670000000\", \n \"vcpuPeriod\": 100000,
> \n \"guestIPs\": \"\", \n \"guestTimezone\": {\n
> \"zone\": \"Europe/London\", \n \"offset\": 0\n }, \n
> \"vcpuQuota\": \"-1\", \n \"guestContainers\": [], \n
> \"kvmEnable\": \"true\", \n \"disks\": {\n \"vda\": {\n
> \"readLatency\": \"0\", \n \"writtenBytes\":
> \"44758528\", \n \"writeOps\": \"1063\", \n
> \"apparentsize\": \"53687091200\", \n \"readOps\":
> \"16151\", \n \"writeLatency\": \"631097\", \n
> \"imageID\": \"758f667c-6e9b-43eb-b09b-c983d78a6374\", \n
> \"readBytes\": \"475618816\", \n \"flushLatency\":
> \"30762\", \n \"readRate\": \"0.0\", \n
> \"truesize\": \"2670891008\", \n \"wr
> iteRate\": \"3276.8\"\n }, \n \"hdc\": {\n
> \"readLatency\": \"0\", \n \"writtenBytes\": \"0\",
> \n \"writeOps\": \"0\", \n \"apparentsize\":
> \"0\", \n \"readOps\": \"4\", \n
> \"writeLatency\": \"0\", \n \"readBytes\": \"152\", \n
> \"flushLatency\": \"0\", \n \"readRate\": \"0.0\",
> \n \"truesize\": \"0\", \n \"writeRate\":
> \"0.0\"\n }\n }, \n \"monitorResponse\": \"0\",
> \n \"guestOsInfo\": {\n \"kernel\":
> \"3.10.0-862.11.6.el7.x86_64\", \n \"type\": \"linux\", \n
> \"version\": \"7.5.1804\", \n \"distribution\": \"CentOS
> Linux\", \n \"arch\": \"x86_64\", \n \"codename\":
> \"Core\"\n }, \n \"username\": \"None\", \n
> \"guestName\": \"ovirt-engine.example.com\", \n \"status\":
> \"Up\",
> \n \"guestCPUCount\": 4, \n \"clientIp\": \"\", \n
> \"statusTime\": \"4344173610\"\n }\n]",
> "stdout_lines": [
> "[",
> " {",
> " \"displayInfo\": [",
> " {",
> " \"tlsPort\": \"5901\", ",
> " \"ipAddress\": \"10.0.0.171\", ",
> " \"port\": \"5900\", ",
> " \"type\": \"spice\"",
> " }, ",
> " {",
> " \"tlsPort\": \"-1\", ",
> " \"ipAddress\": \"10.0.0.171\", ",
> " \"port\": \"5902\", ",
> " \"type\": \"vnc\"",
> " }",
> " ], ",
> " \"memUsage\": \"18\", ",
> " \"acpiEnable\": \"true\", ",
> " \"guestFQDN\": \"ovirt-engine.example.com\", ",
> " \"vmId\": \"e1af6b26-9e48-251-940c-7bfadf920f3f\", ",
> " \"session\": \"Unknown\", ",
> " \"netIfaces\": [",
> " {",
> " \"name\": \"eth0\", ",
> " \"inet6\": [], ",
> " \"inet\": [], ",
> " \"hw\": \"00:18:3d:5b:11:5c\"",
> " }",
> " ], ",
> " \"timeOffset\": \"0\", ",
> " \"memoryStats\": {",
> " \"swap_out\": 0, ",
> " \"majflt\": 0, ",
> " \"minflt\": 198, ",
> " \"mem_cached\": \"447696\", ",
> " \"mem_free\": \"13259132\", ",
> " \"mem_buffers\": \"2104\", ",
> " \"swap_in\": 0, ",
> " \"pageflt\": 198, ",
> " \"mem_total\": \"16258492\", ",
> " \"mem_unused\": \"13259132\"",
> " }, ",
> " \"balloonInfo\": {",
> " \"balloon_max\": \"16777216\", ",
> " \"balloon_cur\": \"16777216\", ",
> " \"balloon_target\": \"16777216\", ",
> " \"balloon_min\": \"1048576\"",
> " }, ",
> " \"pauseCode\": \"NOERR\", ",
> " \"disksUsage\": [",
> " {",
> " \"path\": \"/\", ",
> " \"total\": \"6565134336\", ",
> " \"used\": \"206525344\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/boot\", ",
> " \"total\": \"1063256064\", ",
> " \"used\": \"169177088\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/tmp\", ",
> " \"total\": \"2136997888\", ",
> " \"used\": \"33935360\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/home\", ",
> " \"total\": \"1063256064\", ",
> " \"used\": \"33796096\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/var\", ",
> " \"total\": \"21464350720\", ",
> " \"used\": \"456179712\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/var/log\", ",
> " \"total\": \"10726932480\", ",
> " \"used\": \"43315200\", ",
> " \"fs\": \"xfs\"",
> " }, ",
> " {",
> " \"path\": \"/var/log/audit\", ",
> " \"total\": \"1063256064\", ",
> " \"used\": \"3252160\", ",
> " \"fs\": \"xfs\"",
> " }",
> " ], ",
> " \"network\": {",
> " \"vnet0\": {",
> " \"macAddr\": \"00:18:3d:5b:11:5c\", ",
> " \"rxDropped\": \"0\", ",
> " \"tx\": \"1710\", ",
> " \"txDropped\": \"0\", ",
> " \"rxErrors\": \"0\", ",
> " \"rx\": \"349635\", ",
> " \"txErrors\": \"0\", ",
> " \"state\": \"unknown\", ",
> " \"sampleTime\": 4344173.61, ",
> " \"speed\": \"1000\", ",
> " \"name\": \"vnet0\"",
> " }",
> " }, ",
> " \"vmType\": \"kvm\", ",
> " \"cpuUser\": \"3.00\", ",
> " \"elapsedTime\": \"585\", ",
> " \"vmJobs\": {}, ",
> " \"cpuSys\": \"1.27\", ",
> " \"appsList\": [",
> " \"kernel-3.10.0-862.11.6.el7\", ",
> " \"cloud-init-0.7.9-24.el7.centos.1\", ",
> " \"ovirt-guest-agent-common-1.0.14-1.el7\"",
> " ], ",
> " \"guestOs\": \"3.10.0-862.11.6.el7.x86_64\", ",
> " \"vmName\": \"HostedEngine\", ",
> " \"vcpuCount\": \"4\", ",
> " \"hash\": \"3205592835746233126\", ",
> " \"lastLogin\": 1539038623.192216, ",
> " \"cpuUsage\": \"11670000000\", ",
> " \"vcpuPeriod\": 100000, ",
> " \"guestIPs\": \"\", ",
> " \"guestTimezone\": {",
> " \"zone\": \"Europe/London\", ",
> " \"offset\": 0",
> " }, ",
> " \"vcpuQuota\": \"-1\", ",
> " \"guestContainers\": [], ",
> " \"kvmEnable\": \"true\", ",
> " \"disks\": {",
> " \"vda\": {",
> " \"readLatency\": \"0\", ",
> " \"writtenBytes\": \"44758528\", ",
> " \"writeOps\": \"1063\", ",
> " \"apparentsize\": \"53687091200\", ",
> " \"readOps\": \"16151\", ",
> " \"writeLatency\": \"631097\", ",
> " \"imageID\":
> \"758f667c-6e9b-43eb-b09b-c983d78a6374\", ",
> " \"readBytes\": \"475618816\", ",
> " \"flushLatency\": \"30762\", ",
> " \"readRate\": \"0.0\", ",
> " \"truesize\": \"2670891008\", ",
> " \"writeRate\": \"3276.8\"",
> " }, ",
> " \"hdc\": {",
> " \"readLatency\": \"0\", ",
> " \"writtenBytes\": \"0\", ",
> " \"writeOps\": \"0\", ",
> " \"apparentsize\": \"0\", ",
> " \"readOps\": \"4\", ",
> " \"writeLatency\": \"0\", ",
> " \"readBytes\": \"152\", ",
> " \"flushLatency\": \"0\", ",
> " \"readRate\": \"0.0\", ",
> " \"truesize\": \"0\", ",
> " \"writeRate\": \"0.0\"",
> " }",
> " }, ",
> " \"monitorResponse\": \"0\", ",
> " \"guestOsInfo\": {",
> " \"kernel\": \"3.10.0-862.11.6.el7.x86_64\", ",
> " \"type\": \"linux\", ",
> " \"version\": \"7.5.1804\", ",
> " \"distribution\": \"CentOS Linux\", ",
> " \"arch\": \"x86_64\", ",
> " \"codename\": \"Core\"",
> " }, ",
> " \"username\": \"None\", ",
> " \"guestName\": \"ovirt-engine.example.com\", ",
> " \"status\": \"Up\", ",
> " \"guestCPUCount\": 4, ",
> " \"clientIp\": \"\", ",
> " \"statusTime\": \"4344173610\"",
> " }",
> "]"
> ]
> }"
> 2018-10-08 23:53:19,620+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u"Get VDSM's target engine VM stats", 'ansible_host':
> u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:19,620+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313cc17e10> kwargs
> 2018-10-08 23:53:20,321+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Convert stats to JSON format', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:20,322+0100 DEBUG ansible on_any args TASK: Convert stats
> to JSON format kwargs is_conditional:False
> 2018-10-08 23:53:21,008+0100 DEBUG var changed: host "localhost" var
> "json_stats" type "<type 'list'>" value: "[
> {
> "acpiEnable": "true",
> "appsList": [
> "kernel-3.10.0-862.11.6.el7",
> "cloud-init-0.7.9-24.el7.centos.1",
> "ovirt-guest-agent-common-1.0.14-1.el7"
> ],
> "balloonInfo": {
> "balloon_cur": "16777216",
> "balloon_max": "16777216",
> "balloon_min": "1048576",
> "balloon_target": "16777216"
> },
> "clientIp": "",
> "cpuSys": "1.27",
> "cpuUsage": "11670000000",
> "cpuUser": "3.00",
> "disks": {
> "hdc": {
> "apparentsize": "0",
> "flushLatency": "0",
> "readBytes": "152",
> "readLatency": "0",
> "readOps": "4",
> "readRate": "0.0",
> "truesize": "0",
> "writeLatency": "0",
> "writeOps": "0",
> "writeRate": "0.0",
> "writtenBytes": "0"
> },
> "vda": {
> "apparentsize": "53687091200",
> "flushLatency": "30762",
> "imageID": "758f667c-6e9b-43eb-b09b-c983d78a6374",
> "readBytes": "475618816",
> "readLatency": "0",
> "readOps": "16151",
> "readRate": "0.0",
> "truesize": "2670891008",
> "writeLatency": "631097",
> "writeOps": "1063",
> "writeRate": "3276.8",
> "writtenBytes": "44758528"
> }
> },
> "disksUsage": [
> {
> "fs": "xfs",
> "path": "/",
> "total": "6565134336",
> "used": "206525344"
> },
> {
> "fs": "xfs",
> "path": "/boot",
> "total": "1063256064",
> "used": "169177088"
> },
> {
> "fs": "xfs",
> "path": "/tmp",
> "total": "2136997888",
> "used": "33935360"
> },
> {
> "fs": "xfs",
> "path": "/home",
> "total": "1063256064",
> "used": "33796096"
> },
> {
> "fs": "xfs",
> "path": "/var",
> "total": "21464350720",
> "used": "456179712"
> },
> {
> "fs": "xfs",
> "path": "/var/log",
> "total": "10726932480",
> "used": "43315200"
> },
> {
> "fs": "xfs",
> "path": "/var/log/audit",
> "total": "1063256064",
> "used": "3252160"
> }
> ],
> "displayInfo": [
> {
> "ipAddress": "10.0.0.171",
> "port": "5900",
> "tlsPort": "5901",
> "type": "spice"
> },
> {
> "ipAddress": "10.0.0.171",
> "port": "5902",
> "tlsPort": "-1",
> "type": "vnc"
> }
> ],
> "elapsedTime": "585",
> "guestCPUCount": 4,
> "guestContainers": [],
> "guestFQDN": "ovirt-engine.example.com",
> "guestIPs": "",
> "guestName": "ovirt-engine.example.com",
> "guestOs": "3.10.0-862.11.6.el7.x86_64",
> "guestOsInfo": {
> "arch": "x86_64",
> "codename": "Core",
> "distribution": "CentOS Linux",
> "kernel": "3.10.0-862.11.6.el7.x86_64",
> "type": "linux",
> "version": "7.5.1804"
> },
> "guestTimezone": {
> "offset": 0,
> "zone": "Europe/London"
> },
> "hash": "3205592835746233126",
> "kvmEnable": "true",
> "lastLogin": 1539038623.192216,
> "memUsage": "18",
> "memoryStats": {
> "majflt": 0,
> "mem_buffers": "2104",
> "mem_cached": "447696",
> "mem_free": "13259132",
> "mem_total": "16258492",
> "mem_unused": "13259132",
> "minflt": 198,
> "pageflt": 198,
> "swap_in": 0,
> "swap_out": 0
> },
> "monitorResponse": "0",
> "netIfaces": [
> {
> "hw": "00:18:3d:5b:11:5c",
> "inet": [],
> "inet6": [],
> "name": "eth0"
> }
> ],
> "network": {
> "vnet0": {
> "macAddr": "00:18:3d:5b:11:5c",
> "name": "vnet0",
> "rx": "349635",
> "rxDropped": "0",
> "rxErrors": "0",
> "sampleTime": 4344173.61,
> "speed": "1000",
> "state": "unknown",
> "tx": "1710",
> "txDropped": "0",
> "txErrors": "0"
> }
> },
> "pauseCode": "NOERR",
> "session": "Unknown",
> "status": "Up",
> "statusTime": "4344173610",
> "timeOffset": "0",
> "username": "None",
> "vcpuCount": "4",
> "vcpuPeriod": 100000,
> "vcpuQuota": "-1",
> "vmId": "e1af6b26-9e48-251-940c-7bfadf920f3f",
> "vmJobs": {},
> "vmName": "HostedEngine",
> "vmType": "kvm"
> }
> ]"
> 2018-10-08 23:53:21,009+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'Convert stats to JSON format', 'ansible_host':
> u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:21,009+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313db268d0> kwargs
> 2018-10-08 23:53:21,720+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Get target engine VM IPv4 address from VDSM stats',
> 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:21,720+0100 DEBUG ansible on_any args TASK: Get target
> engine VM IPv4 address from VDSM stats kwargs is_conditional:False
> 2018-10-08 23:53:22,397+0100 DEBUG var changed: host "localhost" var
> "engine_vm_ip_vdsm" type "<class
> 'ansible.utils.unsafe_proxy.AnsibleUnsafeText'>" value: """"
> 2018-10-08 23:53:22,398+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'Get target engine VM IPv4 address from VDSM stats',
> 'ansible_host': u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:22,398+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313caed510> kwargs
> 2018-10-08 23:53:23,124+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'debug', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:23,124+0100 DEBUG ansible on_any args TASK: debug kwargs
> is_conditional:False
> 2018-10-08 23:53:23,807+0100 INFO ansible ok {'status': 'OK',
> 'ansible_task': u'', 'ansible_host': u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:23,807+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313ca343d0> kwargs
> 2018-10-08 23:53:24,511+0100 INFO ansible task start {'status': 'OK',
> 'ansible_task': u'Fail if the Engine has no IP address',
> 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'task'}
> 2018-10-08 23:53:24,511+0100 DEBUG ansible on_any args TASK: Fail if the
> Engine has no IP address kwargs is_conditional:False
> 2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var
> "ansible_play_hosts" type "<type 'list'>" value: "[]"
> 2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var
> "play_hosts" type "<type 'list'>" value: "[]"
> 2018-10-08 23:53:25,152+0100 DEBUG var changed: host "localhost" var
> "ansible_play_batch" type "<type 'list'>" value: "[]"
> 2018-10-08 23:53:25,153+0100 ERROR ansible failed {'status': 'FAILED',
> 'ansible_type': 'task', 'ansible_task': u'Fail if the Engine has no IP
> address', 'ansible_result': u"type: <type 'dict'>\nstr: {'msg': u'Engine VM
> has no IP address. Please check your network configuration', 'changed':
> False, '_ansible_no_log': False}", 'ansible_host': u'localhost',
> 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml'}
> 2018-10-08 23:53:25,153+0100 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7f313caed6d0> kwargs
> ignore_errors:None
> 2018-10-08 23:53:25,154+0100 INFO ansible stats {'status': 'FAILED',
> 'ansible_playbook_duration': 1064.367312, 'ansible_result': u"type: <type
> 'dict'>\nstr: {u'ovirt-engine.example.com': {'unreachable': 0, 'skipped':
> 0, 'ok': 11, 'changed': 6, 'failures': 0}, u'localhost': {'unreachable': 0,
> 'skipped': 5, 'ok': 82, 'changed': 30, 'failures': 2}}",
> 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml',
> 'ansible_type': 'finish'}
> 2018-10-08 23:53:25,154+0100 DEBUG ansible on_any args
> <ansible.executor.stats.AggregateStats object at 0x7f313ef6c290> kwargs
>
>
> I ran it again (I clear the shared NFS storage before each attempt) and got
> the following console output:
> [ INFO ] TASK [Undefine leftover engine VM]
> [ ERROR ] fatal: [localhost]: FAILED! => {"changed": true, "cmd":
> ["virsh", "undefine", "--managed-save", "HostedEngine"], "delta":
> "0:00:00.050407", "end": "2018-10-09 00:16:58.085198", "msg": "non-zero
> return code", "rc": 1, "start": "2018-10-09 00:16:58.034791", "stderr":
> "error: Failed to undefine domain HostedEngine\nerror: Requested operation
> is not valid: cannot undefine transient domain", "stderr_lines": ["error:
> Failed to undefine domain HostedEngine", "error: Requested operation is not
> valid: cannot undefine transient domain"], "stdout": "", "stdout_lines": []}
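As an aside on the "cannot undefine transient domain" error above: that is expected libvirt behaviour. `virsh undefine` only applies to persistent domains; a transient domain (which the leftover HostedEngine is at this point) is removed by destroying it. A sketch of the distinction (the `cleanup_cmd` helper is purely illustrative, not part of the setup code):

```shell
# Illustrative helper: choose the right virsh cleanup command for a
# domain depending on whether it is persistent or transient.
cleanup_cmd() {
    domain="$1"; persistence="$2"
    if [ "$persistence" = "transient" ]; then
        # A transient domain has no on-disk definition: destroying it
        # both stops it and makes it disappear from libvirt.
        echo "virsh destroy $domain"
    else
        # Only a persistent domain can be undefined.
        echo "virsh undefine --managed-save $domain"
    fi
}

cleanup_cmd HostedEngine transient    # prints: virsh destroy HostedEngine
```

So on a host where a previous deploy left the transient HostedEngine behind, `virsh destroy HostedEngine` is the cleanup the playbook's `virsh undefine` cannot perform.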
> _______________________________________________
> Users mailing list -- ***@ovirt.org
> To unsubscribe send an email to users-***@ovirt.org
> Privacy Statement: https://www.ovirt.org/site/privacy-policy/
> oVirt Code of Conduct:
> https://www.ovirt.org/community/about/community-guidelines/
> List Archives:
> https://lists.ovirt.org/archives/list/***@ovirt.org/message/FLSQBNEQSC6IRYHPGFTPHAAWI6RNLKU3/
>
m***@brendanh.com
2018-10-09 14:52:53 UTC
Permalink
I've added a record to the DNS server here:
ovirt-engine.example.com 10.0.0.109

This IP address is on the same physical network as the host (the host is on 10.0.0.171). I trust this is correct and that the name should not resolve to a natted IP instead. I notice that, regardless of this record, the name ovirt-engine.example.com resolves to a natted IP, 192.168.124.51, because the ansible script adds an entry to /etc/hosts:
192.168.124.51 ovirt-engine.example.com
While the script is running, I can successfully ping ovirt-engine.example.com, and it responds on 192.168.124.51. So, as you say, the "host can correctly resolve the name of the engine VM", but it's not the DNS record's IP. If I remove the DNS record and run hosted-engine --deploy, I get this error:
[ ERROR ] Host name is not valid: ovirt-engine.example.com did not resolve into an IP address

Anyway, I added the DNS record back and re-ran the hosted-engine --deploy command; it failed at:
[ INFO ] TASK [Clean /etc/hosts on the host]
[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}

To debug, I added tasks to create_target_vm.yml that output the values of local_vm_ip.std_out_lines[0] and FQDN that are used in this task, then ran the usual deploy command again. They are both localhost:
[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]
[ INFO ] ok: [localhost]
[ INFO ] TASK [show FQDN]
[ INFO ] ok: [localhost]

This time, it gets past [Clean /etc/hosts on the host], but hangs at [ INFO ] TASK [Check engine VM health], the same as before. I catted /etc/hosts while it was hanging; it contains:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

The ovirt-engine.example.com entry has been deleted! I pinged ovirt-engine.example.com and it now resolves to its IP on the physical network: 10.0.0.109. So I added this /etc/hosts entry back:
192.168.124.51 ovirt-engine.example.com
It subsequently errored:
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.167559", "end": "2018-10-09 15:43:41.947274", "rc": 0, "start": "2018-10-09 15:43:41.779715", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"conf_
on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}"]}
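The key part of that wall of JSON is the engine-status object: the VM itself is up, but the engine service inside it never answered its health check ("failed liveliness check"). A quick sketch of pulling that verdict out of the output (the JSON fragment is inlined from the log above; on the host you would pipe `hosted-engine --vm-status --json` instead):

```shell
# Engine-status fragment copied from the failed task above.
status_json='{"1": {"engine-status": {"reason": "failed liveliness check", "health": "bad", "vm": "up", "detail": "Up"}}, "global_maintenance": false}'

# "vm": "up" but "health": "bad" means the VM runs, while the engine
# web application inside it is not answering yet.
echo "$status_json" | grep -o '"health": "[a-z]*"'    # prints: "health": "bad"
```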

How can I check the hosted-engine's IP address to ensure name resolution is correct?
Simone Tiraboschi
2018-10-09 15:51:01 UTC
Permalink
On Tue, Oct 9, 2018 at 4:54 PM <***@brendanh.com> wrote:

> I'ved added a record to the DNS server here:
> ovirt-engine.example.com 10.0.0.109
>

OK, and how will the engine VM get that address?
Are you using DHCP? Do you have a DHCP reservation for the MAC address you
are using on the engine VM?
Or are you configuring it with a static IP?


>
> This IP address is on the physical network that the host is on (host is on
> 10.0.0.171). I trust this is correct and I should not resolve to a natted
> IP instead. I notice that regardless of this record, the name
> ovirt-engine.example.com resolves to a natted IP: 192.168.124.51 because
> the ansible script adds an entry to /etc/hosts:
> 192.168.124.51 ovirt-engine.example.com
> While the script is running, if I I can successfully ping
> ovirt-engine.example.com, it responds on 192.168.124.51. So as you say:
> "host can correctly resolve the name of the engine VM", but it's not the
> DNS record's IP. If I remove the DNS record and run hosted-engine
> --deploy, I get error:
> [ ERROR ] Host name is not valid: ovirt-engine.example.com did not
> resolve into an IP address
>
> Anyway, I added back the DNS record and ran hosted-engine --deploy
> command, it failed at:
> [ INFO ] TASK [Clean /etc/hosts on the host]
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 396, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n changed_when:
> True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
> To debug, I added tasks to create_target_vm.yml that output the values of
> local_vm_ip.std_out_lines[0] and FQDN that are used in this task, then ran
> the usual deploy command again. They are both localhost:
> [ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to
> etc hosts]
> [ INFO ] ok: [localhost]
> [ INFO ] TASK [show FQDN]
> [ INFO ] ok: [localhost]
>
> This time, it gets past [Clean /etc/hosts on the host], but hangs at [
> INFO ] TASK [Check engine VM health] same as before.


This is fine: the bootstrap local VM runs over a natted network; then, once
ready, it will be shut down and moved to the shared storage. At that point it
will be restarted on your management network.


> I catted /etc/hosts while it was hanging and it contains:
> 127.0.0.1 localhost localhost.localdomain localhost4
> localhost4.localdomain4
> ::1 localhost localhost.localdomain localhost6
> localhost6.localdomain6
>
> The ovirt-engine.example.com has been deleted! I pinged
> ovirt-engine.example.com and it now resolves to its IP on the physical
> network: 10.0.0.109. So I added back this /etc/hosts entry:
> 192.168.124.51 ovirt-engine.example.com


Please avoid this.


>
> It subsequently errored:
> [ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed":
> true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta":
> "0:00:00.167559", "end": "2018-10-09 15:43:41.947274", "rc": 0, "start":
> "2018-10-09 15:43:41.779715", "stderr": "", "stderr_lines": [], "stdout":
> "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810
> (Tue Oct 9 15:43:36
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9
> 15:43:37
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810,
> \"host-ts\": 6810}, \"global_maintenance\": false}", "stdout_lines":
> ["{\"1\": {\"conf_
> on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810
> (Tue Oct 9 15:43:36
> 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9
> 15:43:37
> 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810,
> \"host-ts\": 6810}, \"global_maintenance\": false}"]}
>
> How can I check the hosted-engine's IP address to ensure name resolution
> is correct?
>

You can connect to that VM with VNC and check the IP there.
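For example (a sketch: set a temporary console password on the host with the real `hosted-engine --add-console-password` command, then point any VNC client at the host IP and the VNC port from the "displayInfo" stats earlier in this thread; the port may differ on your run):

```shell
# Build the viewer command from the values VDSM reported above
# (host 10.0.0.171, VNC port 5902). Assumes remote-viewer is installed.
host_ip=10.0.0.171
vnc_port=5902
printf 'remote-viewer vnc://%s:%s\n' "$host_ip" "$vnc_port"   # prints: remote-viewer vnc://10.0.0.171:5902
```

Once connected, log in on the console and check the interface with `ip addr` to see whether the engine VM actually received 10.0.0.109.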


Brendan Holmes
2018-10-09 21:49:51 UTC
Permalink
Hi Simone,



Yes, the MAC address in answers.conf (OVEHOSTED_VM/vmMACAddr=) is added as a reservation on the DHCP server, so in theory 10.0.0.109 should be assigned.



However, perhaps DHCP is not working. I have just changed to a static IP instead:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24

(let me know if this isn’t the correct way)
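For reference, the static-IP answer-file section might look like this (a sketch: cloudinitVMStaticCIDR is the key already used above; the DNS key name cloudinitVMDNS is my assumption from memory, so verify it against a generated answers file before relying on it):

```ini
[environment:default]
# Static address for the engine VM (value from this thread)
OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
# Assumed key name -- check your generated answers file
OVEHOSTED_VM/cloudinitVMDNS=str:10.0.0.1
```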



My host fails to get an IP automatically from this DHCP server, so it is quite possible the engine's DHCP has been failing too. Each time the host boots, I must run dhclient in order to receive an IP address. Anyway, after changing this and re-running hosted-engine --deploy, it failed due to:



[ INFO ] TASK [Copy local VM disk to shared storage]

[ INFO ] changed: [localhost]

[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]

[ INFO ] ok: [localhost]

[ INFO ] TASK [show FQDN]

[ INFO ] ok: [localhost]

[ INFO ] TASK [Clean /etc/hosts on the host]

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 400, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n debug: var=FQDN\n - name: Clean /etc/hosts on the host\n ^ here\n"}
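As an aside on the host itself needing a manual dhclient after every boot (mentioned above): on CentOS/oVirt Node that usually means the management interface is not set to come up at boot. A sketch of the two lines the interface's ifcfg file typically needs, assuming the bond is named bond0 (adjust the device name; the file would be /etc/sysconfig/network-scripts/ifcfg-bond0):

```ini
DEVICE=bond0
ONBOOT=yes
BOOTPROTO=dhcp
```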



I have just tried deploying using the webui: same error. I suspect the “undefined variable” is local_vm_ip.std_out_lines[0]. My new debug task that tries to output this is:

- name: show local_vm_ip.std_out_lines[0] that will be written to etc hosts

debug: var=local_vm_ip.stdout_lines[0]



You can see the output of this above. I think I was mistaken to suggest that its value is localhost; localhost is just the machine the task ran on. I don’t think the list local_vm_ip.std_out_lines is defined at all. Any more ideas?
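If it helps for the next attempt, a guarded version of such a probe (a sketch: the default() filter keeps an undefined register from aborting the play with "list object has no element 0", and the when: guard skips the index when the list is empty):

```yaml
- name: Show local_vm_ip safely
  debug:
    msg: "local_vm_ip is {{ local_vm_ip | default('UNDEFINED') }}"

- name: Show first stdout line if present
  debug:
    msg: "first line: {{ local_vm_ip.stdout_lines[0] }}"
  when: (local_vm_ip.stdout_lines | default([])) | length > 0
```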



Many thanks



Simone Tiraboschi
2018-10-10 11:05:59 UTC
Permalink
On Tue, Oct 9, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:

> Hi Simone,
>
>
>
> Yes the MAC address in answers.conf: OVEHOSTED_VM/vmMACAddr=
>
> is added as a reservation to the DHCP server, so in theory 10.0.0.109
> should be assigned.
>
>
>
> However perhaps DHCP is not working. I have just changed to a static IP
> instead:
>
> OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
>
> (let me know if this isn’t the correct way)
>
>
>
> My host fails to get an IP automatically from this DHCP server, so it is
> quite possible engine’s DHCP has been failing too. Each time the host
> boots, I must type dhclient in order to receive an IP address. Anyway,
> after changing this and re-running hosted-engine –deploy, failed due to:
>
>
>
> [ INFO ] TASK [Copy local VM disk to shared storage]
>
> [ INFO ] changed: [localhost]
>
> [ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to
> etc hosts]
>
> [ INFO ] ok: [localhost]
>
> [ INFO ] TASK [show FQDN]
>
> [ INFO ] ok: [localhost]
>
> [ INFO ] TASK [Clean /etc/hosts on the host]
>
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 400, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n debug:
> var=FQDN\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
>
>
> I have just tried deploying using the webui, same error. I suspect the
> “undefined variable” is local_vm_ip.std_out_lines[0]. My new debug task
> that tries to output this is:
>
> - name: show local_vm_ip.std_out_lines[0] that will be written to etc
> hosts
>
> debug: var=local_vm_ip.stdout_lines[0]
>
>
>
> You can see the output of this above. I think I was mistaken to suggest
> the value of this is localhost. Localhost is just the machine this task
> ran on. I don’t think list local_vm_ip.std_out_lines is defined. Any more
> ideas?
>

The issue is in a task that isn't part of the code we are shipping.
I can only suggest reinstalling the rpm to get rid of any modifications and
restarting the deployment from scratch with a static IP, since your DHCP
server is not working properly.


>
> How can I check the hosted-engine's IP address to ensure name resolution
> is correct?
>
>
>
> You can connect to that VM with VNC and check the IP there.
>
>
>
> _______________________________________________
> Users mailing list -- ***@ovirt.org
> To unsubscribe send an email to users-***@ovirt.org
> Privacy Statement: https://www.ovirt.org/site/privacy-policy/
> oVirt Code of Conduct:
> https://www.ovirt.org/community/about/community-guidelines/
> List Archives:
> https://lists.ovirt.org/archives/list/***@ovirt.org/message/SVBXIBLS5TSP7SZROSSE6JD5ICBZLV3E/
>
>
Brendan Holmes
2018-10-13 22:08:30 UTC
Permalink
Hi Simone,



“restart from scratch deploying with a static IP”. Okay, I have reinstalled the host using oVirt Node from scratch. I am assigning a static IP using the attached answers.conf, which contains:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
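For reference, otopi answer-file values carry a type prefix (str:, bool:, none:), as in the line above. A minimal sketch of how such entries can be decoded (a hypothetical helper for illustration, not the actual otopi implementation):

```python
def parse_answer(value):
    """Decode an otopi-style 'type:value' answer-file entry (sketch)."""
    kind, _, raw = value.partition(":")
    if kind == "none":
        return None           # none:None -> unset, setup will prompt or default
    if kind == "bool":
        return raw == "True"  # bool:True / bool:False
    if kind == "int":
        return int(raw)
    return raw                # str: and anything else kept as text

print(parse_answer("str:10.0.0.109/24"))  # 10.0.0.109/24
print(parse_answer("bool:True"))          # True
print(parse_answer("none:None"))          # None
```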



create_target_vm.yml and all other Red Hat code is as shipped. I’m getting the same error:



[ INFO ] TASK [Copy /etc/hosts back to the Hosted Engine VM]

[ INFO ] changed: [localhost]

[ INFO ] TASK [Copy local VM disk to shared storage]

[ INFO ] changed: [localhost]

[ INFO ] TASK [Clean /etc/hosts on the host]

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}



Any ideas? How can I debug this failure to assign an IP (undefined variable)?



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 10 October 2018 12:06
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Tue, Oct 9, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:

Hi Simone,



Yes the MAC address in answers.conf: OVEHOSTED_VM/vmMACAddr=

is added as a reservation to the DHCP server, so in theory 10.0.0.109 should be assigned.



However, perhaps DHCP is not working. I have just changed to a static IP instead:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24

(let me know if this isn’t the correct way)



My host fails to get an IP automatically from this DHCP server, so it is quite possible the engine’s DHCP has been failing too. Each time the host boots, I must type dhclient to receive an IP address. Anyway, after changing this and re-running hosted-engine --deploy, it failed with:



[ INFO ] TASK [Copy local VM disk to shared storage]

[ INFO ] changed: [localhost]

[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]

[ INFO ] ok: [localhost]

[ INFO ] TASK [show FQDN]

[ INFO ] ok: [localhost]

[ INFO ] TASK [Clean /etc/hosts on the host]

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 400, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n debug: var=FQDN\n - name: Clean /etc/hosts on the host\n ^ here\n"}



I have just tried deploying using the webui, same error. I suspect the “undefined variable” is local_vm_ip.stdout_lines[0]. My new debug task that tries to output this is:

- name: show local_vm_ip.std_out_lines[0] that will be written to etc hosts

debug: var=local_vm_ip.stdout_lines[0]



You can see the output of this above. I think I was mistaken to suggest that its value is localhost; localhost is just the machine the task ran on. I don’t think the list local_vm_ip.stdout_lines is defined at all. Any more ideas?



The issue is in a task that isn't part of the code we are shipping.

I can only suggest reinstalling the rpm to get rid of any modifications and restarting from scratch, deploying with a static IP if your DHCP server is not working properly.





Many thanks



From: Simone Tiraboschi <***@redhat.com>
Sent: 09 October 2018 16:51
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Tue, Oct 9, 2018 at 4:54 PM <***@brendanh.com> wrote:

I've added a record to the DNS server here:
ovirt-engine.example.com 10.0.0.109



OK, and how will the engine VM get that address?

Are you using DHCP? Do you have a DHCP reservation for the MAC address you are using on the engine VM?

Are you configuring it with a static IP?




This IP address is on the physical network that the host is on (the host is on 10.0.0.171). I trust this is correct and that the name should not resolve to a natted IP instead. I notice that regardless of this record, the name ovirt-engine.example.com resolves to a natted IP, 192.168.124.51, because the ansible script adds an entry to /etc/hosts:
192.168.124.51 ovirt-engine.example.com
While the script is running, I can successfully ping ovirt-engine.example.com; it responds on 192.168.124.51. So as you say, "host can correctly resolve the name of the engine VM", but it's not the DNS record's IP. If I remove the DNS record and run hosted-engine --deploy, I get error:
[ ERROR ] Host name is not valid: ovirt-engine.example.com did not resolve into an IP address

Anyway, I added back the DNS record and ran hosted-engine --deploy command, it failed at:
[ INFO ] TASK [Clean /etc/hosts on the host]
[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}

To debug, I added tasks to create_target_vm.yml that output the values of local_vm_ip.std_out_lines[0] and FQDN that are used in this task, then ran the usual deploy command again. They are both localhost:
[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]
[ INFO ] ok: [localhost]
[ INFO ] TASK [show FQDN]
[ INFO ] ok: [localhost]

This time, it gets past [Clean /etc/hosts on the host], but hangs at [ INFO ] TASK [Check engine VM health] same as before.



This is fine, the bootstrap local VM runs over a natted network then, once ready it will be shutdown and moved to the shared storage. At that point it will be restarted on your management network.



I catted /etc/hosts while it was hanging and it contains:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

The ovirt-engine.example.com entry has been deleted! I pinged ovirt-engine.example.com and it now resolves to its IP on the physical network: 10.0.0.109. So I added back this /etc/hosts entry:
192.168.124.51 ovirt-engine.example.com



Please avoid this.




It subsequently errored:
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.167559", "end": "2018-10-09 15:43:41.947274", "rc": 0, "start": "2018-10-09 15:43:41.779715", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"conf_
on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}"]}

How can I check the hosted-engine's IP address to ensure name resolution is correct?



You can connect to that VM with VNC and check the IP there.
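Alternatively, the JSON that `hosted-engine --vm-status --json` prints (quoted in the error above) can be parsed on the host to watch the engine state. A minimal sketch on a captured sample, using only Python's standard library (the sample is trimmed from the output quoted in this thread):

```python
import json

# Trimmed sample of `hosted-engine --vm-status --json` output from this thread.
sample = ('{"1": {"hostname": "host", "engine-status": {"reason": "failed liveliness check", '
          '"health": "bad", "vm": "up", "detail": "Up"}, "score": 3400}, '
          '"global_maintenance": false}')

status = json.loads(sample)
for key, value in status.items():
    if key == "global_maintenance":  # top-level flag, not a host entry
        continue
    es = value["engine-status"]
    print(f"host {key}: vm={es['vm']} health={es['health']} reason={es.get('reason', '-')}")
```

Here the VM is up but health is bad, i.e. the VM runs yet the engine's health page does not answer, which is consistent with a name-resolution or networking problem inside the VM.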



_______________________________________________
Users mailing list -- ***@ovirt.org
To unsubscribe send an email to users-***@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/***@ovirt.org/message/SVBXIBLS5TSP7SZROSSE6JD5ICBZLV3E/
Simone Tiraboschi
2018-10-15 03:49:16 UTC
Permalink
On Sun, Oct 14, 2018 at 6:08 AM Brendan Holmes <***@brendanh.com> wrote:

> Hi Simone,
>
>
>
> “restart from scratch deploying with a static IP”. Okay, I have
> reinstalled the host using oVirt Node from scratch. I am assigning a
> static IP using the attached answers.conf which contains:
>
> OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
>
>
>
> create_target_vm.yml and all other RedHat code is as-shipped. I’m getting
> the same error:
>
>
>
> [ INFO ] TASK [Copy /etc/hosts back to the Hosted Engine VM]
>
> [ INFO ] changed: [localhost]
>
> [ INFO ] TASK [Copy local VM disk to shared storage]
>
> [ INFO ] changed: [localhost]
>
> [ INFO ] TASK [Clean /etc/hosts on the host]
>
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 396, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n changed_when:
> True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
>
I think this is just an error in the cleanup-on-failure phase, caused by
something that went wrong earlier.
Can you please attach the whole deployment log from
/var/log/ovirt-hosted-engine/hosted-engine.log?


>
>
> Any ideas? How can I debug this failure to assign an IP (undefined
> variable)?
>
>
>
> Many thanks,
>
> Brendan
>
>
>
> *From:* Simone Tiraboschi <***@redhat.com>
> *Sent:* 10 October 2018 12:06
> *To:* B Holmes <***@brendanh.com>
> *Cc:* users <***@ovirt.org>
> *Subject:* Re: [ovirt-users] Re: Diary of hosted engine install woes
>
>
>
>
>
> On Tue, Oct 9, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:
>
> Hi Simone,
>
>
>
> Yes the MAC address in answers.conf: OVEHOSTED_VM/vmMACAddr=
>
> is added as a reservation to the DHCP server, so in theory 10.0.0.109
> should be assigned.
>
>
>
> However perhaps DHCP is not working. I have just changed to a static IP
> instead:
>
> OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
>
> (let me know if this isn’t the correct way)
>
>
>
> My host fails to get an IP automatically from this DHCP server, so it is
> quite possible engine’s DHCP has been failing too. Each time the host
> boots, I must type dhclient in order to receive an IP address. Anyway,
> after changing this and re-running hosted-engine --deploy, it failed with:
>
>
>
> [ INFO ] TASK [Copy local VM disk to shared storage]
>
> [ INFO ] changed: [localhost]
>
> [ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to
> etc hosts]
>
> [ INFO ] ok: [localhost]
>
> [ INFO ] TASK [show FQDN]
>
> [ INFO ] ok: [localhost]
>
> [ INFO ] TASK [Clean /etc/hosts on the host]
>
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 400, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n debug:
> var=FQDN\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
>
>
> I have just tried deploying using the webui, same error. I suspect the
> “undefined variable” is local_vm_ip.std_out_lines[0]. My new debug task
> that tries to output this is:
>
> - name: show local_vm_ip.std_out_lines[0] that will be written to etc
> hosts
>
> debug: var=local_vm_ip.stdout_lines[0]
>
>
>
> You can see the output of this above. I think I was mistaken to suggest
> the value of this is localhost. Localhost is just the machine this task
> ran on. I don’t think list local_vm_ip.std_out_lines is defined. Any more
> ideas?
>
>
>
> The issue is on a task that isn't part of the code we are shipping.
>
> I can just suggest to simply reinstall the rpm to get rid of any
> modification and restart from scratch deploying with a static IP if your
> DHCP server is not properly working.
>
>
>
>
>
> Many thanks
>
>
>
> *From:* Simone Tiraboschi <***@redhat.com>
> *Sent:* 09 October 2018 16:51
> *To:* B Holmes <***@brendanh.com>
> *Cc:* users <***@ovirt.org>
> *Subject:* Re: [ovirt-users] Re: Diary of hosted engine install woes
>
>
>
>
>
> On Tue, Oct 9, 2018 at 4:54 PM <***@brendanh.com> wrote:
>
> I've added a record to the DNS server here:
> ovirt-engine.example.com 10.0.0.109
>
>
>
> OK, and how will the engine VM get that address?
>
> Are you using DHCP? Do you have a DHCP reservation for the MAC address you
> are using on the engine VM?
>
> Are you configuring it with a static IP?
>
>
>
>
> This IP address is on the physical network that the host is on (host is on
> 10.0.0.171). I trust this is correct and I should not resolve to a natted
> IP instead. I notice that regardless of this record, the name
> ovirt-engine.example.com resolves to a natted IP: 192.168.124.51 because
> the ansible script adds an entry to /etc/hosts:
> 192.168.124.51 ovirt-engine.example.com
> While the script is running, I can successfully ping
> ovirt-engine.example.com; it responds on 192.168.124.51. So as you say:
> "host can correctly resolve the name of the engine VM", but it's not the
> DNS record's IP. If I remove the DNS record and run hosted-engine
> --deploy, I get error:
> [ ERROR ] Host name is not valid: ovirt-engine.example.com did not
> resolve into an IP address
>
> Anyway, I added back the DNS record and ran hosted-engine --deploy
> command, it failed at:
> [ INFO ] TASK [Clean /etc/hosts on the host]
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 396, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n changed_when:
> True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
> To debug, I added tasks to create_target_vm.yml that output the values of
> local_vm_ip.std_out_lines[0] and FQDN that are used in this task, then ran
> the usual deploy command again. They are both localhost:
> [ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to
> etc hosts]
> [ INFO ] ok: [localhost]
> [ INFO ] TASK [show FQDN]
> [ INFO ] ok: [localhost]
>
> This time, it gets past [Clean /etc/hosts on the host], but hangs at [
> INFO ] TASK [Check engine VM health] same as before.
>
>
>
> This is fine, the bootstrap local VM runs over a natted network then, once
> ready it will be shutdown and moved to the shared storage. At that point it
> will be restarted on your management network.
>
>
>
> I catted /etc/hosts while it was hanging and it contains:
> 127.0.0.1 localhost localhost.localdomain localhost4
> localhost4.localdomain4
> ::1 localhost localhost.localdomain localhost6
> localhost6.localdomain6
>
> The ovirt-engine.example.com has been deleted! I pinged
> ovirt-engine.example.com and it now resolves to its IP on the physical
> network: 10.0.0.109. So I added back this /etc/hosts entry:
> 192.168.124.51 ovirt-engine.example.com
>
>
>
> Please avoid this.
>
>
>
>
> It subsequently errored:
> [ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed":
> true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta":
> "0:00:00.167559", "end": "2018-10-09 15:43:41.947274", "rc": 0, "start":
> "2018-10-09 15:43:41.779715", "stderr": "", "stderr_lines": [], "stdout":
> "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810
> (Tue Oct 9 15:43:36 2018)
> \\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9
> 15:43:37 2018)
> \\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810,
> \"host-ts\": 6810}, \"global_maintenance\": false}", "stdout_lines":
> ["{\"1\": {\"conf_
> on_shared_storage\": true, \"live-data\": true, \"extra\":
> \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810
> (Tue Oct 9 15:43:36 2018)
> \\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9
> 15:43:37 2018)
> \\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\",
> \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\":
> \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\",
> \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\":
> false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810,
> \"host-ts\": 6810}, \"global_maintenance\": false}"]}
>
> How can I check the hosted-engine's IP address to ensure name resolution
> is correct?
>
>
>
> You can connect to that VM with VNC and check the IP there.
>
>
>
> _______________________________________________
> Users mailing list -- ***@ovirt.org
> To unsubscribe send an email to users-***@ovirt.org
> Privacy Statement: https://www.ovirt.org/site/privacy-policy/
> oVirt Code of Conduct:
> https://www.ovirt.org/community/about/community-guidelines/
> List Archives:
> https://lists.ovirt.org/archives/list/***@ovirt.org/message/SVBXIBLS5TSP7SZROSSE6JD5ICBZLV3E/
>
>
Brendan Holmes
2018-10-21 21:50:11 UTC
Permalink
Hi Simone,



Sorry for the late response, I’ve been unwell. Attached is the whole log you requested. I hope it reveals why the variable “local_vm_ip.stdout_lines” is not being populated.



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 15 October 2018 04:49



On Sun, Oct 14, 2018 at 6:08 AM Brendan Holmes <***@brendanh.com> wrote:

Hi Simone,

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}



I think this is just an error in the cleanup-on-failure phase, caused by something that went wrong earlier.

Can you please attach the whole deployment log from /var/log/ovirt-hosted-engine/hosted-engine.log?
Simone Tiraboschi
2018-10-22 12:33:01 UTC
Permalink
On Sun, Oct 21, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:

> Hi Simone,
>
>
>
> Sorry for late response, I’ve been unwell. Attached is the whole log you
> requested. I hope it reveals why variable “local_vm_ip.stdout_lines” is
> not being populated.
>

Hi,
according to your log file the deployment was successful:
2018-10-14 13:06:36,316+0100 INFO otopi.plugins.gr_he_common.core.misc
misc._terminate:250 Hosted Engine successfully deployed

But from the logs I see that you introduced custom ansible tasks and you
also removed relevant code parts:
2018-10-14 13:04:31,778+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 TASK [Copy local VM disk to shared
storage]
2018-10-14 13:04:59,019+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 changed: [localhost]
2018-10-14 13:04:59,721+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 TASK [pause]
2018-10-14 13:05:00,422+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 ok: [localhost]
2018-10-14 13:05:01,124+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 TASK [debug]
2018-10-14 13:05:01,725+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [],
u'changed': True, u'end': u'2018-10-14 12:58:22.193727', u'stdout': u'',
u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:31:d3:9e |
awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, u'delta':
u'0:00:00.069294', u'stderr': u'', u'rc': 0, 'stdout_lines': [], u'start':
u'2018-10-14 12:58:22.124433'}
2018-10-14 13:05:02,426+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 TASK [debug]
2018-10-14 13:05:03,127+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 local_vm_ip.stdout_lines[0]: VARIABLE IS
NOT DEFINED!
2018-10-14 13:05:03,828+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 TASK [write value of
local_vm_ip.stdout_lines[0]]
2018-10-14 13:05:04,530+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 ok: [localhost]
2018-10-14 13:05:05,231+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 TASK [write value of FQDN]
2018-10-14 13:05:05,832+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:100 ok: [localhost]
2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:180
ansible-playbook rc: 0
2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 PLAY RECAP [localhost] : ok: 89 changed:
24 unreachable: 0 skipped: 3 failed: 0
2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:94 PLAY RECAP [ovirt-engine.example.com] :
ok: 11 changed: 6 unreachable: 0 skipped: 0 failed: 0
2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:187
ansible-playbook stdout:
2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 [pause]

2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 Pausing
because next block will fail miserably:

2018-10-14 13:05:06,434+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:190
ansible-playbook stderr:
2018-10-14 13:05:06,435+0100 ERROR
otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:192
[WARNING]: Not waiting for response to prompt as stdin is not interactive


At this point I can only suggest again simply running the vanilla code
without any custom patches.
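For what it's worth, the debug output above also shows the root cause of the "list object has no element 0" failure: the `virsh -r net-dhcp-leases default | grep -i <MAC> | awk '{ print $5 }' | cut -f1 -d'/'` pipeline produced an empty stdout, so `local_vm_ip.stdout_lines` was an empty list. A sketch re-implementing that parsing in Python on hypothetical lease-table lines (the IP and hostname below are made up) illustrates both cases:

```python
def lease_ip(virsh_output, mac):
    """Mimic: grep -i MAC | awk '{ print $5 }' | cut -f1 -d'/'."""
    ips = []
    for line in virsh_output.splitlines():
        if mac.lower() in line.lower():
            fields = line.split()
            if len(fields) >= 5:
                ips.append(fields[4].split("/")[0])  # field 5 is IP/prefix
    return ips

# Hypothetical lease table line in the column order printed by virsh:
#   expiry-date  expiry-time  MAC  protocol  IP/prefix  hostname
with_lease = "2018-10-14 12:58:00  00:16:3e:31:d3:9e  ipv4  192.168.124.51/24  ovirt-engine"

print(lease_ip(with_lease, "00:16:3e:31:d3:9e"))  # ['192.168.124.51']
print(lease_ip("", "00:16:3e:31:d3:9e"))          # [] -> stdout_lines has no element 0
```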


>
> Many thanks,
>
> Brendan
>
>
>
> *From:* Simone Tiraboschi <***@redhat.com>
> *Sent:* 15 October 2018 04:49
>
> On Sun, Oct 14, 2018 at 6:08 AM Brendan Holmes <***@brendanh.com> wrote:
>
> Hi Simone,
>
> [ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an
> option with an undefined variable. The error was: list object has no
> element 0\n\nThe error appears to have been in
> '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line
> 396, column 5, but may\nbe elsewhere in the file depending on the exact
> syntax problem.\n\nThe offending line appears to be:\n\n changed_when:
> True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
>
>
>
> I think that this is just an error in the clean up on failures phase due
> to something bad before.
>
> Can you please attach the whole deployment log from
> /var/log/ovirt-hosted-engine/hosted-engine.log ?
>
>
>
Brendan Holmes
2018-10-22 13:19:36 UTC
Permalink
Hi Simone,



I had exactly the same error in the prior run, which is why I added simple, non-function-changing debug code. I’ll send you the log from a run without it; it won’t differ except for lacking some extra info.



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 22 October 2018 13:33
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Sun, Oct 21, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:

Hi Simone,



Sorry for late response, I’ve been unwell. Attached is the whole log you requested. I hope it reveals why variable “local_vm_ip.stdout_lines” is not being populated.



Hi,

according to your log file the deployment was successful:

2018-10-14 13:06:36,316+0100 INFO otopi.plugins.gr_he_common.core.misc misc._terminate:250 Hosted Engine successfully deployed



But from the logs I see that you introduced custom ansible tasks and you also removed relevant code parts:

2018-10-14 13:04:31,778+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Copy local VM disk to shared storage]

2018-10-14 13:04:59,019+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 changed: [localhost]

2018-10-14 13:04:59,721+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [pause]

2018-10-14 13:05:00,422+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]

2018-10-14 13:05:01,124+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 TASK [debug]

2018-10-14 13:05:01,725+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-14 12:58:22.193727', u'stdout': u'', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:31:d3:9e | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, u'delta': u'0:00:00.069294', u'stderr': u'', u'rc': 0, 'stdout_lines': [], u'start': u'2018-10-14 12:58:22.124433'}

2018-10-14 13:05:02,426+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 TASK [debug]

2018-10-14 13:05:03,127+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip.stdout_lines[0]: VARIABLE IS NOT DEFINED!

2018-10-14 13:05:03,828+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [write value of local_vm_ip.stdout_lines[0]]

2018-10-14 13:05:04,530+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]

2018-10-14 13:05:05,231+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [write value of FQDN]

2018-10-14 13:05:05,832+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 ok: [localhost]

2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:180 ansible-playbook rc: 0

2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [localhost] : ok: 89 changed: 24 unreachable: 0 skipped: 3 failed: 0

2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [ovirt-engine.example.com] : ok: 11 changed: 6 unreachable: 0 skipped: 0 failed: 0

2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:187 ansible-playbook stdout:

2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 [pause]



2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 Pausing because next block will fail miserably:



2018-10-14 13:05:06,434+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:190 ansible-playbook stderr:

2018-10-14 13:05:06,435+0100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:192 [WARNING]: Not waiting for response to prompt as stdin is not interactive





at this point I can only suggest again to simply run the vanilla code without any custom patch.





Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 15 October 2018 04:49

On Sun, Oct 14, 2018 at 6:08 AM Brendan Holmes <***@brendanh.com> wrote:

Hi Simone,

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}
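The "list object has no element 0" in the error above is what Ansible reports when a template indexes `local_vm_ip.stdout_lines[0]` while the virsh lease lookup printed nothing, leaving `stdout_lines` empty. A minimal reproduction of the failure mode (the names mirror the log; the guard shown is illustrative, not the actual playbook fix):

```python
# What the registered variable holds after an empty lease lookup.
local_vm_ip = {"stdout_lines": []}

try:
    ip = local_vm_ip["stdout_lines"][0]
except IndexError:
    # Jinja2 surfaces this as "list object has no element 0"
    ip = None

# A defensive template would test the length first, e.g.
# "{{ local_vm_ip.stdout_lines[0] if local_vm_ip.stdout_lines | length > 0 else '' }}"
guarded = local_vm_ip["stdout_lines"][0] if local_vm_ip["stdout_lines"] else ""
print(repr(guarded))  # -> ''
```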



I think that this is just an error in the clean up on failures phase due to something bad before.

Can you please attach the whole deployment log from /var/log/ovirt-hosted-engine/hosted-engine.log ?
Brendan Holmes
2018-10-22 20:37:18 UTC
Permalink
Hi Simone,



Pls see attached the log without any of my variable-enumeration changes.



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 22 October 2018 13:33
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Sun, Oct 21, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com> wrote:

Hi Simone,



Sorry for the late response, I’ve been unwell. Attached is the whole log you requested. I hope it reveals why the variable “local_vm_ip.stdout_lines” is not being populated.



Hi,

according to your log file the deployment was successful:



at this point I can only suggest again to simply run the vanilla code without any custom patch.



Simone Tiraboschi
2018-10-23 08:08:46 UTC
Permalink
OK,
according to this log file, the target engine VM didn't get an IPv4 address:

2018-10-22 21:22:23,360+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 engine_vm_ip_vdsm:
2018-10-22 21:22:24,062+0100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:100 TASK [Fail if the Engine has no IP address]
2018-10-22 21:22:24,763+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 {u'msg': u'Engine VM has no IP address. Please check your network configuration', u'changed': False, u'_ansible_no_log': False}
2018-10-22 21:22:24,863+0100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:98 fatal: [localhost]: FAILED! => {"changed": false, "msg": "Engine VM has no IP address. Please check your network configuration"}
2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:180 ansible-playbook rc: 2
2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [localhost] : ok: 82 changed: 31 unreachable: 0 skipped: 5 failed: 2
2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 PLAY RECAP [ovirt-engine.example.com] : ok: 11 changed: 6 unreachable: 0 skipped: 0 failed: 0
2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:187 ansible-playbook stdout:
2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:189 to retry, use: --limit @/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.retry

2018-10-22 21:22:25,365+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:190 ansible-playbook stderr:
2018-10-22 21:22:25,366+0100 DEBUG otopi.context context._executeMethod:143 method exception
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/otopi/context.py", line 133, in _executeMethod
    method['method']()
  File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/target_vm.py", line 214, in _closeup
    r = ah.run()
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_setup/ansible_utils.py", line 194, in run
    raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2018-10-22 21:22:25,367+0100 ERROR otopi.context context._executeMethod:152 Failed to execute stage 'Closing up': Failed executing ansible-playbook


Are you sure that your DHCP server is properly working?
You can also connect to that VM with VNC to check what's going on.

Brendan Holmes
2018-10-23 13:42:37 UTC
Permalink
Hi Simone,



I accidentally ran this using DHCP, which I know is problematic on my network. I had previously run it with the answer-file parameter:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:<static_IP_address>



The same error occurred last time, but I’ll try again and resend the full log.



Many thanks,

Brendan



Brendan Holmes
2018-10-23 21:19:30 UTC
Permalink
Hi Simone,



I attempted to use a fixed IP address instead of DHCP. I attach the log of an attempt using the answer-file parameter:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24

which is in accordance with this example on your website:

https://ovirt.org/develop/release-management/features/integration/heapplianceflow/



However, you will notice that the log instead records:

OVEHOSTED_VM/cloudinitVMStaticCIDR=bool:'False'



The same error as usual occurred. Is this:

1. The wrong syntax/format for specifying a “cloudinitVMStaticCIDR” IP?
2. oVirt setup ignoring it: either failing to assign the IP to the engine VM, or failing to populate the variable “local_vm_ip.stdout_lines”?
3. Something else?
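On the syntax question: each otopi answer-file value carries a type prefix (`str:`, `bool:`, `int:`, `none:`), so `cloudinitVMStaticCIDR=bool:'False'` in the log means the setup stored the boolean False rather than the CIDR string. A toy parser sketch of that format (illustrative only, mirroring the answer file above, not the actual otopi implementation):

```python
def parse_answer_value(raw):
    """Decode an otopi-style '<type>:<literal>' answer-file value."""
    kind, _, literal = raw.partition(":")
    if kind == "none":
        return None
    if kind == "bool":
        # the log shows bool values both quoted ('False') and bare (True)
        return literal.strip("'") == "True"
    if kind == "int":
        return int(literal)
    return literal  # str: values are taken verbatim, e.g. "10.0.0.109/24"

print(parse_answer_value("str:10.0.0.109/24"))  # the intended static CIDR
print(parse_answer_value("bool:'False'"))       # what the log recorded
print(parse_answer_value("none:None"))          # unset keys
```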



“You can also connect to that VM with VNC to check what's going on.” – Doesn’t work. Port-probing 10.0.0.109 on 5800/5900 fails, so I don’t think the VM is networked. Which port does the engine’s VNC use?
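For reference, libvirt VNC displays are numbered relative to TCP port 5900: `virsh vncdisplay <domain>` prints a `:N` form, and the port is 5900 + N. Note also that the bootstrap engine VM's console is served by libvirt on the *host* (commonly bound to 127.0.0.1), not on the engine's own IP, which alone would explain a failed probe of 10.0.0.109. The helper below is a hypothetical convenience, not part of any oVirt tool:

```python
VNC_BASE_PORT = 5900

def vnc_port(display):
    """Map a virsh display string like ':0' or '127.0.0.1:1' to (host, port)."""
    host, _, number = display.rpartition(":")
    return (host or "localhost"), VNC_BASE_PORT + int(number)

print(vnc_port(":0"))           # -> ('localhost', 5900)
print(vnc_port("127.0.0.1:1"))  # -> ('127.0.0.1', 5901)
```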



Many thanks,

Brendan



Simone Tiraboschi
2018-10-24 07:44:19 UTC
Permalink
Can you please retry installing
https://jenkins.ovirt.org/job/ovirt-hosted-engine-setup_4.2_build-artifacts-el7-x86_64/157/artifact/exported-artifacts/ovirt-hosted-engine-setup-2.2.29-1.el7.noarch.rpm
and using it interactively, without manually filling in an answer file?

Brendan Holmes
2018-10-24 14:56:01 UTC
Permalink
Hi Simone,



My attempt to install the RPM produced hundreds of lines similar to:

/usr/lib/python2.7/site-packages/ovirt_hosted_engine_setup/__init__.pyc from install of ovirt-hosted-engine-setup-2.2.29-1.el7.noarch conflicts with file from package ovirt-hosted-engine-setup-2.2.26-1.el7.noarch



Do I need to install the matching version of oVirt Node?



“using it interactively without manually filling in an answer file” – if you mean running: hosted-engine --deploy

and then manually typing each value, I tried this (choosing the static IP option) and the usual error occurred. I attach the log from this attempt.



Many thanks,

Brendan



Simone Tiraboschi
2018-10-24 15:10:56 UTC
Permalink

The issue here is that it seems your VM never got an address from the libvirt default network's DHCP:
2018-10-24 15:36:14,370+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-24 15:36:13.196641', u'stdout': u'', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:3d:13:d8 | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, u'delta': u'0:00:00.054641', u'stderr': u'', u'rc': 0, 'stdout_lines': [], u'start': u'2018-10-24 15:36:13.142000'}

Can you please share the output of
virsh -r net-dhcp-leases default

Are you running nested on a different hypervisor?


Brendan Holmes
2018-10-24 15:48:08 UTC
Permalink
Hi Simone,



[***@host ~]# virsh -r net-dhcp-leases default

Expiry Time MAC address Protocol IP address Hostname Client ID or DUID

-------------------------------------------------------------------------------------------------------------------



[***@host ~]#



No, the node is running on bare metal; it was installed from a bootable USB stick.



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com>
Sent: 24 October 2018 16:11
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Wed, Oct 24, 2018 at 4:56 PM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,



My attempt to install the RPM output hundreds of lines similar to:

/usr/lib/python2.7/site-packages/ovirt_hosted_engine_setup/__init__.pyc from install of ovirt-hosted-engine-setup-2.2.29-1.el7.noarch conflicts with file from package ovirt-hosted-engine-setup-2.2.26-1.el7.noarch



Do I need to install the matching version of ovirt node?



“using it interactively without manually filing an answerfile” – if you mean using command: hosted-engine –deploy

and then manually typing each value, I tried this (choosing the static IP option) and the usual error occurs. I attach the log from this attempt.



The issue here is that it seams that your VM never got an address from libvirt default network DHCP:

2018-10-24 15:36:14,370+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-24 15:36:13.196641', u'stdout': u'', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:3d:13:d8 | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, u'delta': u'0:00:00.054641', u'stderr': u'', u'rc': 0, 'stdout_lines': [], u'start': u'2018-10-24 15:36:13.142000'}



Can you please share the output of

virsh -r net-dhcp-leases default



Are you running nested on a different hypervisor?





Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com <mailto:***@redhat.com> >
Sent: 24 October 2018 08:44
To: B Holmes <***@brendanh.com <mailto:***@brendanh.com> >
Cc: users <***@ovirt.org <mailto:***@ovirt.org> >
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes



Can you please retry installing https://jenkins.ovirt.org/job/ovirt-hosted-engine-setup_4.2_build-artifacts-el7-x86_64/157/artifact/exported-artifacts/ovirt-hosted-engine-setup-2.2.29-1.el7.noarch.rpm and using it interactively without manually filing an answerfile?

On Tue, Oct 23, 2018 at 11:19 PM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,



I attempted to use a fixed IP address instead of DHCP. I attach an attempt using answer-file parameter:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24 <http://10.0.0.109/24>

which is in accordance with this example on your website:

https://ovirt.org/develop/release-management/features/integration/heapplianceflow/



However, instead of this, you will notice that the log has:

OVEHOSTED_VM/cloudinitVMStaticCIDR=bool:'False'



The same error as usual occurred. Is this:

1. The wrong syntax/format for specifying a “cloudinitVMStaticCIDR” IP?
2. ovirt setup is ignoring it: either failing to assign the IP to the engine VM or to variable “local_vm_ip.stdout_lines”?
3. Something else?
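For what it's worth, the otopi answer-file format encodes every value with a type prefix (str:, bool:, int:, none:None), which is why str:10.0.0.109/24 and bool:'False' are read as entirely different values. A minimal decoder sketch (simplified; this is an illustration, not the actual otopi implementation):

```python
# Decode otopi-style "type:value" answer-file entries (simplified sketch).

def decode(raw):
    kind, _, value = raw.partition(':')
    if kind == 'none':
        return None
    if kind == 'bool':
        return value.strip("'\"") == 'True'
    if kind == 'int':
        return int(value)
    return value  # 'str' and anything else: keep as plain text

print(decode('str:10.0.0.109/24'))   # -> 10.0.0.109/24
print(decode("bool:'False'"))        # -> False
print(decode('none:None'))           # -> None
```

So a log line showing cloudinitVMStaticCIDR=bool:'False' means the key fell back to its unset default, rather than the str: value having been mis-parsed.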



“You can also connect to that VM with VNC to check what's going on.” – Doesn’t work. Port-querying 10.0.0.109 on 5800/5900 fails, so I don’t think the VM is networked. Which port does engine VNC use?



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com <mailto:***@redhat.com> >
Sent: 23 October 2018 09:09
To: B Holmes <***@brendanh.com <mailto:***@brendanh.com> >
Cc: users <***@ovirt.org <mailto:***@ovirt.org> >
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes



OK,

according to this log file, the target engine VM didn't get an IPv4 address:







Are you sure that your DHCP server is properly working?

You can also connect to that VM with VNC to check what's going on.



On Mon, Oct 22, 2018 at 10:37 PM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,



Please see attached the log without any of my variable-enumeration changes.



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com <mailto:***@redhat.com> >
Sent: 22 October 2018 13:33
To: B Holmes <***@brendanh.com <mailto:***@brendanh.com> >
Cc: users <***@ovirt.org <mailto:***@ovirt.org> >
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Sun, Oct 21, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,



Sorry for the late response, I’ve been unwell. Attached is the whole log you requested. I hope it reveals why variable “local_vm_ip.stdout_lines” is not being populated.



Hi,

according to your log file the deployment was successful:



at this point I can only suggest again to simply run the vanilla code without any custom patch.



From: Simone Tiraboschi <***@redhat.com <mailto:***@redhat.com> >
Sent: 15 October 2018 04:49

On Sun, Oct 14, 2018 at 6:08 AM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}



I think this is just an error in the cleanup-on-failure phase, caused by an earlier failure.

Can you please attach the whole deployment log from /var/log/ovirt-hosted-engine/hosted-engine.log ?
Brendan Holmes
2018-10-25 14:28:19 UTC
Permalink
“The issue here is that it seems that your VM never got an address from libvirt default network DHCP” – but aren’t I obviating DHCP by using parameter:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24

? VM should receive a static IP instead.
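As a sanity check on the value itself, Python's standard ipaddress module accepts 10.0.0.109/24 as a valid address-plus-prefix, so the string matches the "address/prefix" form shown in the heapplianceflow example; this snippet only validates the format, it says nothing about what hosted-engine setup actually does with it:

```python
import ipaddress

# Validate the "address/prefix" string passed via cloudinitVMStaticCIDR.
iface = ipaddress.ip_interface('10.0.0.109/24')

print(iface.ip)       # -> 10.0.0.109
print(iface.network)  # -> 10.0.0.0/24
# The gateway from the answer file should sit in the same subnet:
print(ipaddress.ip_address('10.0.0.1') in iface.network)  # -> True
```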



Perhaps there is a bug in hosted-engine setup that forces it to require DHCP. If you have no more suggestions, I will buy a new router with different DHCP server.



Many thanks,

Brendan



From: Brendan Holmes <***@brendanh.com>
Sent: 24 October 2018 16:48
To: 'Simone Tiraboschi' <***@redhat.com>
Cc: 'users' <***@ovirt.org>
Subject: RE: [ovirt-users] Re: Diary of hosted engine install woes



Hi Simone,



[***@host ~]# virsh -r net-dhcp-leases default

Expiry Time MAC address Protocol IP address Hostname Client ID or DUID

-------------------------------------------------------------------------------------------------------------------



[***@host ~]#



No, node is running on bare-metal. Node was installed by plugging in a bootable USB.



Many thanks,

Brendan



Simone Tiraboschi
2018-10-26 07:39:20 UTC
Permalink
On Thu, Oct 25, 2018 at 4:29 PM Brendan Holmes <***@brendanh.com> wrote:

> “The issue here is that it seems that your VM never got an address from
> libvirt default network DHCP” – but aren’t I obviating DHCP by using
> parameter:
>
> OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24
>
> ? VM should receive a static IP instead.
>
>
>
> Perhaps there is a bug in hosted-engine setup that forces it to require
> DHCP.
>

No, this is expected.
The deploy process launches a bootstrap local VM that gets a
temporary address via DHCP on the libvirt default NATted network.
The engine running on that VM is used to configure the environment and
create a VM on the shared storage.
Only at the end is the local VM shut down and its disk moved over the
disk of the VM created by the engine on the shared storage.

In your case it seems that the bootstrap VM for some reason fails to get
an address from DHCP on the libvirt default NATted network.
I'd suggest trying to connect to that VM via VNC to see what's going on
there.

If you have no more suggestions, I will buy a new router with different
> DHCP server.
Brendan Holmes
2018-10-27 19:32:38 UTC
Permalink
Hi Simone,



“bootstrap VM for some reason fails to get an address from DHCP on the libvirt default NATted network”.



I respectfully disagree. Line 1784 of the most recent log I sent you is:

2018-10-24 14:12:26,524+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-24 14:12:25.437958', u'stdout': u'192.168.124.175', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:3d:13:d8 | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, 'attempts': 2, u'stderr': u'', u'rc': 0, u'delta': u'0:00:00.070076', 'stdout_lines': [u'192.168.124.175'], u'start': u'2018-10-24 14:12:25.367882'}

I am able to ping the natted IP.

Also I wrote after an earlier attempt: “ovirt-engine.example.com resolves to a natted IP: 192.168.124.51”. (https://www.mail-archive.com/***@ovirt.org/msg51515.html).



The process fails as the bootstrap VM is moved to the final engine VM on the shared storage. I still think that if I solve the DHCP problem on my network, the engine VM will be created successfully. Are you able to confirm that the Node VM (https://www.ovirt.org/node/) has DHCP enabled? If so, then since it has the same DHCP problem as the engine VM (I have to type “dhclient” each boot to get an IP), I can use Node to troubleshoot.



Many thanks,

Brendan



Brendan Holmes
2018-10-14 16:28:00 UTC
Permalink
Hi Simone,



Here is the value of local_vm_ip during the bootstrap VM phase:

TASK [Create local VM]

TASK [Get local VM IP]:

2018-10-14 11:50:40,831+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-14 11:50:39.860036', u'stdout': u'192.168.124.16', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:31:d3:9e | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, 'attempts': 2, u'stderr': u'', u'rc': 0, u'delta': u'0:00:00.058237', 'stdout_lines': [u'192.168.124.16'], u'start': u'2018-10-14 11:50:39.801799'}



And this is its value at the moment the setup crashes (at the usual place – “Clean /etc/hosts on the host”):

2018-10-14 12:58:23,373+0100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:94 local_vm_ip: {'stderr_lines': [], u'changed': True, u'end': u'2018-10-14 12:58:22.193727', u'stdout': u'', u'cmd': u"virsh -r net-dhcp-leases default | grep -i 00:16:3e:31:d3:9e | awk '{ print $5 }' | cut -f1 -d'/'", 'failed': False, u'delta': u'0:00:00.069294', u'stderr': u'', u'rc': 0, 'stdout_lines': [], u'start': u'2018-10-14 12:58:22.124433'}



stdout_lines has disappeared.
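That empty stdout_lines is exactly what produces the "list object has no element 0" failure later: the "Clean /etc/hosts on the host" task indexes local_vm_ip.stdout_lines[0], and indexing an empty list raises the error that Ansible/Jinja reports. A toy reproduction of the mechanism (the dict is a trimmed copy of the task result above):

```python
# The task result once the lease has expired: stdout is empty,
# so stdout_lines is an empty list.
local_vm_ip = {'stdout': '', 'stdout_lines': []}

try:
    ip = local_vm_ip['stdout_lines'][0]  # what the cleanup task effectively evaluates
except IndexError as exc:
    # Jinja surfaces this condition as: "list object has no element 0"
    print(type(exc).__name__)            # -> IndexError
```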



After the error, I ran command:

[***@host04 ansible]# virsh -r net-dhcp-leases default

Expiry Time MAC address Protocol IP address Hostname Client ID or DUID

-------------------------------------------------------------------------------------------------------------------



[***@host04 ansible]#



Is the problem that I have assigned a static IP (using cloudinitVMStaticCIDR), yet the command Ansible runs to obtain the IP address (virsh -r net-dhcp-leases) only lists IPs obtained via DHCP, not static ones? Perhaps the bootstrap VM always obtains a DHCP IP on the NATted network, so this command works at that earlier stage. Your documentation doesn’t mention a net-xxx command to obtain a list of static IPs.



Not sure if it’s helpful, but here’s the output of a net-dumpxml command:

[***@host04 ansible]# virsh -r net-dumpxml default
<network>
  <name>default</name>
  <uuid>91fe2eee-c36e-4d08-9928-bff2d036aca5</uuid>
  <forward mode='nat'>
    <nat>
      <port start='1024' end='65535'/>
    </nat>
  </forward>
  <bridge name='virbr0' stp='on' delay='0'/>
  <mac address='52:54:00:22:1b:4b'/>
  <ip address='192.168.124.1' netmask='255.255.255.0'>
    <dhcp>
      <range start='192.168.124.2' end='192.168.124.254'/>
    </dhcp>
  </ip>
</network>

[***@host04 ansible]#
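The dump confirms the default network is NATted with a DHCP range but no static host entries; if you want to pull those fields out programmatically, the XML parses cleanly (sketch against the output pasted above, trimmed to the relevant parts):

```python
import xml.etree.ElementTree as ET

# The net-dumpxml output from above, trimmed to the relevant parts.
XML = """<network>
  <name>default</name>
  <forward mode='nat'/>
  <ip address='192.168.124.1' netmask='255.255.255.0'>
    <dhcp>
      <range start='192.168.124.2' end='192.168.124.254'/>
    </dhcp>
  </ip>
</network>"""

net = ET.fromstring(XML)
rng = net.find('./ip/dhcp/range')
print(net.find('forward').get('mode'))   # -> nat
print(rng.get('start'), rng.get('end'))  # -> 192.168.124.2 192.168.124.254
# No <host mac='...' ip='...'/> entries under <dhcp>: no static reservations defined.
print(net.findall('./ip/dhcp/host'))     # -> []
```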



Many thanks,

Brendan



From: Brendan Holmes <***@brendanh.com>
Sent: 13 October 2018 23:09
To: 'Simone Tiraboschi' <***@redhat.com>
Cc: 'users' <***@ovirt.org>
Subject: RE: [ovirt-users] Re: Diary of hosted engine install woes



Hi Simone,



“restart from scratch deploying with a static IP”. Okay, I have reinstalled the host using oVirt Node from scratch. I am assigning a static IP using the attached answers.conf which contains:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24



create_target_vm.yml and all other Red Hat code is as shipped. I’m getting the same error:



[ INFO ] TASK [Copy /etc/hosts back to the Hosted Engine VM]

[ INFO ] changed: [localhost]

[ INFO ] TASK [Copy local VM disk to shared storage]

[ INFO ] changed: [localhost]

[ INFO ] TASK [Clean /etc/hosts on the host]

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}



Any ideas? How can I debug this failure to assign an IP (undefined variable)?



Many thanks,

Brendan



From: Simone Tiraboschi <***@redhat.com <mailto:***@redhat.com> >
Sent: 10 October 2018 12:06
To: B Holmes <***@brendanh.com <mailto:***@brendanh.com> >
Cc: users <***@ovirt.org <mailto:***@ovirt.org> >
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Tue, Oct 9, 2018 at 11:50 PM Brendan Holmes <***@brendanh.com <mailto:***@brendanh.com> > wrote:

Hi Simone,



Yes the MAC address in answers.conf: OVEHOSTED_VM/vmMACAddr=

is added as a reservation to the DHCP server, so in theory 10.0.0.109 should be assigned.



However perhaps DHCP is not working. I have just changed to a static IP instead:

OVEHOSTED_VM/cloudinitVMStaticCIDR=str:10.0.0.109/24 <http://10.0.0.109/24>

(let me know if this isn’t the correct way)



My host fails to get an IP automatically from this DHCP server, so it is quite possible the engine’s DHCP has been failing too. Each time the host boots, I must type dhclient to receive an IP address. Anyway, after changing this and re-running hosted-engine --deploy, it failed with:



[ INFO ] TASK [Copy local VM disk to shared storage]

[ INFO ] changed: [localhost]

[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]

[ INFO ] ok: [localhost]

[ INFO ] TASK [show FQDN]

[ INFO ] ok: [localhost]

[ INFO ] TASK [Clean /etc/hosts on the host]

[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 400, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n debug: var=FQDN\n - name: Clean /etc/hosts on the host\n ^ here\n"}



I have just tried deploying using the webui, same error. I suspect the “undefined variable” is local_vm_ip.std_out_lines[0]. My new debug task that tries to output this is:

- name: show local_vm_ip.std_out_lines[0] that will be written to etc hosts
  debug: var=local_vm_ip.stdout_lines[0]



You can see the output of this above. I think I was mistaken to suggest the value of this is localhost; localhost is just the machine the task ran on. I don't think the list local_vm_ip.std_out_lines is defined at all. Any more ideas?
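In case it helps anyone reproduce this: the result attribute on an Ansible command/shell task is stdout_lines, not std_out_lines, and a guarded debug task avoids the "list object has no element 0" blow-up when the list is empty. A sketch of what I should have added (assumes local_vm_ip is a command result, as in the shipped playbook):

```yaml
# Safer debug task for create_target_vm.yml (sketch)
- name: show the IP that would be written to /etc/hosts
  debug:
    msg: "{{ local_vm_ip.stdout_lines | first | default('EMPTY - no IP detected') }}"
```

The first | default idiom prints a marker instead of failing when no IP was captured.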



The issue is in a task that isn't part of the code we are shipping.

I can only suggest reinstalling the rpm to get rid of any modifications, then restarting the deployment from scratch with a static IP if your DHCP server is not working properly.





Many thanks



From: Simone Tiraboschi <***@redhat.com>
Sent: 09 October 2018 16:51
To: B Holmes <***@brendanh.com>
Cc: users <***@ovirt.org>
Subject: Re: [ovirt-users] Re: Diary of hosted engine install woes





On Tue, Oct 9, 2018 at 4:54 PM <***@brendanh.com> wrote:

I've added a record to the DNS server here:
ovirt-engine.example.com 10.0.0.109



OK, and how will the engine VM get that address?

Are you using DHCP? Do you have a DHCP reservation for the MAC address you are using on the engine VM?
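For instance, with ISC dhcpd such a reservation would look like the following (a sketch; the MAC here is a placeholder, use the value from OVEHOSTED_VM/vmMACAddr in your answers.conf):

```
# ISC dhcpd.conf host reservation for the engine VM (sketch)
host ovirt-engine {
  hardware ethernet 00:16:3e:aa:bb:cc;  # placeholder MAC
  fixed-address 10.0.0.109;
}
```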

Are you configuring it with a static IP?




This IP address is on the physical network that the host is on (the host is 10.0.0.171). I trust this is correct and the name should not resolve to a NATed IP instead. I notice that regardless of this record, the name ovirt-engine.example.com resolves to a NATed IP, 192.168.124.51, because the ansible script adds an entry to /etc/hosts:
192.168.124.51 ovirt-engine.example.com
While the script is running, I can successfully ping ovirt-engine.example.com and it responds on 192.168.124.51. So as you say, "the host can correctly resolve the name of the engine VM", but it's not the DNS record's IP. If I remove the DNS record and run hosted-engine --deploy, I get the error:
[ ERROR ] Host name is not valid: ovirt-engine.example.com did not resolve into an IP address

Anyway, I added back the DNS record and ran the hosted-engine --deploy command again; it failed at:
[ INFO ] TASK [Clean /etc/hosts on the host]
[ ERROR ] fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: list object has no element 0\n\nThe error appears to have been in '/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml': line 396, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n changed_when: True\n - name: Clean /etc/hosts on the host\n ^ here\n"}

To debug, I added tasks to create_target_vm.yml that output the values of local_vm_ip.std_out_lines[0] and FQDN that are used in this task, then ran the usual deploy command again. They are both localhost:
[ INFO ] TASK [show local_vm_ip.std_out_lines[0] that will be written to etc hosts]
[ INFO ] ok: [localhost]
[ INFO ] TASK [show FQDN]
[ INFO ] ok: [localhost]

This time, it gets past [Clean /etc/hosts on the host], but hangs at [ INFO ] TASK [Check engine VM health], same as before.



This is fine: the bootstrap local VM runs over a NATed network; then, once ready, it will be shut down and moved to the shared storage. At that point it will be restarted on your management network.



I catted /etc/hosts while it was hanging and it contains:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

The ovirt-engine.example.com entry has been deleted! I pinged ovirt-engine.example.com and it now resolves to its IP on the physical network, 10.0.0.109. So I added back this /etc/hosts entry:
192.168.124.51 ovirt-engine.example.com



Please avoid this.




It subsequently errored:
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.167559", "end": "2018-10-09 15:43:41.947274", "rc": 0, "start": "2018-10-09 15:43:41.779715", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"conf_on_shared_storage\": true, \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6810 (Tue Oct 9 15:43:36 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=6810 (Tue Oct 9 15:43:37 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"hostname\": \"host\", \"host-id\": 1, \"engine-status\": {\"reason\": \"failed liveliness check\", \"health\": \"bad\", \"vm\": \"up\", \"detail\": \"Up\"}, \"score\": 3400, \"stopped\": false, \"maintenance\": false, \"crc32\": \"c5d76f8b\", \"local_conf_timestamp\": 6810, \"host-ts\": 6810}, \"global_maintenance\": false}"]}
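To make sense of that blob, note it is just the JSON that hosted-engine --vm-status --json returns. A quick Python sketch pulls out why the health check failed (the sample string below is abridged from the output above, keeping only the keys I look at):

```python
import json

# Abridged sample of the `hosted-engine --vm-status --json` output
# quoted in the error above (keys reduced for readability).
status_json = '''
{"1": {"hostname": "host", "host-id": 1,
       "engine-status": {"reason": "failed liveliness check",
                         "health": "bad", "vm": "up", "detail": "Up"},
       "score": 3400, "stopped": false, "maintenance": false,
       "local_conf_timestamp": 6810, "host-ts": 6810},
 "global_maintenance": false}
'''

status = json.loads(status_json)
for key, host in status.items():
    if key == "global_maintenance":
        continue  # top-level flag, not a host entry
    es = host["engine-status"]
    print(f"host {key}: vm={es['vm']}, health={es['health']}, reason={es['reason']}")
```

So the VM itself is up, but the engine inside it is not answering the liveliness check.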

How can I check the hosted-engine's IP address to ensure name resolution is correct?



You can connect to that VM with VNC and check the IP there.
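For example (a sketch; the HostedEngineLocal domain name is what appears during the bootstrap phase on a node like this, and the display number may differ on yours):

```
# Read-only libvirt connection, no vdsm credentials needed
virsh -r list --all
# Find the VNC display of the bootstrap VM
virsh -r vncdisplay HostedEngineLocal
# Then point a VNC client at the host, e.g.
remote-viewer vnc://host:5900
```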



_______________________________________________
Users mailing list -- ***@ovirt.org
To unsubscribe send an email to users-***@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/***@ovirt.org/message/SVBXIBLS5TSP7SZROSSE6JD5ICBZLV3E/