Discussion:
[ovirt-users] Error Deployment New Hosted Engine on 4.4.1 node
Dominique D
2021-05-13 13:16:30 UTC
I have a production environment with 3 servers running version 4.4.1 that works very well (Gluster and hosted engine).

I prepared a lab with the same setup (and the same ISO) to test an update to 4.4.6.

The hyperconverged deployment works, but when I run hosted-engine --deploy (or do it from Cockpit), at some point the host displays the messages below and reboots, and the engine is not installed.

I tried with the latest 4.4.6 ISO (fresh install) and I get the same messages. Is there a network change in CentOS 8.3?

[ 33.785588] bondscan-yhEyFK: option fail_over_mac: invalid value (3)
[ 33.787183] bondscan-yhEyFK: option arp_all_targets: invalid value (2)
[ 33.804832] bondscan-yhEyFK: option arp_validate: invalid value (7)
[ 33.807568] bondscan-yhEyFK: option xmit_hash_policy: invalid value (6)
[ 33.807884] bondscan-yhEyFK: option lacp_rate: mode dependency failed, not supported in mode broadcast(3)
[ 33.809858] bondscan-yhEyFK: option ad_select: invalid value (3)
[ 33.811003] bondscan-yhEyFK: option primary_reselect: invalid value (3)
[ 33.813490] bondscan-yhEyFK: option fail_over_mac: invalid value (3)
[ 33.815175] bondscan-yhEyFK: option arp_all_targets: invalid value (2)
[ 33.839036] bondscan-yhEyFK: option arp_validate: mode dependency failed, not supported in mode 802.3ad(4)
[ 33.841905] bondscan-yhEyFK: option xmit_hash_policy: invalid value (6)
[ 33.842628] bondscan-yhEyFK: option lacp_rate: invalid value (2)
[ 33.844573] bondscan-yhEyFK: option ad_select: invalid value (3)
[ 33.846293] bondscan-yhEyFK: option primary_reselect: invalid value (3)
[ 33.847492] bondscan-yhEyFK: option fail_over_mac: invalid value (3)
[ 33.848978] bondscan-yhEyFK: option arp_all_targets: invalid value (2)
[ 33.871465] bondscan-yhEyFK: option arp_validate: mode dependency failed, not supported in mode balance-tlb(5)
[ 33.874149] bondscan-yhEyFK: option xmit_hash_policy: invalid value (6)
[ 33.874475] bondscan-yhEyFK: option lacp_rate: mode dependency failed, not supported in mode balance-tlb(5)
[ 33.876514] bondscan-yhEyFK: option ad_select: invalid value (3)
[ 33.877381] bondscan-yhEyFK: option primary_reselect: invalid value (3)
[ 33.878554] bondscan-yhEyFK: option fail_over_mac: invalid value (3)
[ 33.879991] bondscan-yhEyFK: option arp_all_targets: invalid value (2)
[ 33.909461] bondscan-yhEyFK: option arp_validate: mode dependency failed, not supported in mode balance-alb(6)
[ 33.914593] bondscan-yhEyFK: option xmit_hash_policy: invalid value (6)
[ 33.916657] bondscan-yhEyFK: option lacp_rate: mode dependency failed, not supported in mode balance-alb(6)
[ 33.920665] bondscan-yhEyFK: option ad_select: invalid value (3)
[ 33.927749] bondscan-yhEyFK: option primary_reselect: invalid value (3)
[ 33.932404] bondscan-yhEyFK: option fail_over_mac: invalid value (3)
[ 33.935589] bondscan-yhEyFK: option arp_all_targets: invalid value (2)
_______________________________________________
Users mailing list -- ***@ovirt.org
To unsubscribe send an email to users-***@ovirt.org
Privacy Statement: https://www.ovirt.org/privacy-policy.html
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/users@
Ales Musil
2021-05-14 05:14:24 UTC
On Thu, May 13, 2021 at 3:33 PM Dominique D wrote:
Post by Dominique D
I have a setup with 3 servers production environment with version 4.4.1
that works very well (gluster and hosted-engine)
I prepared a lab with the same setup (with the same ISO) to test a update to (4.4.6)
The deployment of the Hyperconverge works but when I want to do
hosted-engine --deploy or in the cokpit, at some time the host displays
this message and reboot and the engine is not installed.
I tried with the latest ISO 4.4.6 (Fresh Install) and I have the same
message. Is there a network change with version 8.3 of CentOS?
[...]
Hi,

This is a harmless message produced by vdsm when it scans the available
bonding options.

Did you check the hosted-engine deploy logs for any hint of what might
be wrong?
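For reference: vdsm creates a temporary bondscan-* bond device and tries every option value to learn what the running kernel supports; the kernel logs each rejected value, which is exactly the noise above. A minimal sketch for filtering that probe noise out of a dmesg capture (the non-bondscan sample line is invented for illustration):

```python
import re

# Kernel messages emitted during vdsm's bond-option probe all come from
# a throwaway device named "bondscan-<random suffix>".
BONDSCAN_RE = re.compile(r"\bbondscan-\w+:")

def split_bondscan_noise(lines):
    """Partition dmesg-style lines into (probe_noise, kept)."""
    noise, kept = [], []
    for line in lines:
        (noise if BONDSCAN_RE.search(line) else kept).append(line)
    return noise, kept

sample = [
    "[   33.785588] bondscan-yhEyFK: option fail_over_mac: invalid value (3)",
    "[   33.809858] bondscan-yhEyFK: option ad_select: invalid value (3)",
    "[   40.000000] eno1: link becomes ready",  # illustrative non-probe line
]
noise, kept = split_bondscan_noise(sample)
```

Anything left in "kept" is worth a closer look; the bondscan lines themselves are expected.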

Thanks,
Ales
--
Ales Musil

Software Engineer - RHV Network

Red Hat EMEA <https://www.redhat.com>

***@redhat.com IM: amusil
<https://red.ht/sig>
Dominique D
2021-05-17 14:23:11 UTC
I have this message in the log:

2021-05-17 09:25:23,453-0400 INFO ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : Always revoke the SSO token'}
2021-05-17 09:25:23,453-0400 DEBUG ansible on_any args TASK: ovirt.hosted_engine_setup : Always revoke the SSO token kwargs is_conditional:False
2021-05-17 09:25:23,454-0400 DEBUG ansible on_any args localhostTASK: ovirt.hosted_engine_setup : Always revoke the SSO token kwargs
2021-05-17 09:25:26,338-0400 ERROR ansible failed {
"ansible_host": "localhost",
"ansible_playbook": "/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml",
"ansible_result": {
"_ansible_no_log": false,
"changed": false,
"invocation": {
"module_args": {
"ca_file": null,
"compress": true,
"headers": null,
"hostname": null,
"insecure": null,
"kerberos": false,
"ovirt_auth": {
"ansible_facts": {
"ovirt_auth": {
"ca_file": null,
"compress": true,
"headers": null,
"insecure": true,
"kerberos": false,
"timeout": 0,
"token": "RbzZ0C-JVaKvywnQAPWfZUWP4q4lWTnrEi0VsyIMBisPK-s00KX6xUw2dAH2oc0PHnkhtiWy8UBbAi7C7I3v_w",
"url": "https://oe.telecom.lan/ovirt-engine/api"
}
},
"attempts": 1,
"changed": false,
"failed": false
},
"password": null,
"state": "absent",
"timeout": 0,
"token": null,
"url": null,
"username": null
}
},
"msg": "You must specify either 'url' or 'hostname'."
},
"ansible_task": "Always revoke the SSO token",
"ansible_type": "task",
"status": "FAILED",
"task_duration": 4
}
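The failing line is the "msg" at the bottom of this record: the ovirt_auth module was invoked with state "absent" to revoke the SSO token, but both "url" and "hostname" were null, so its parameter validation aborted before anything was revoked. A rough sketch of that check (a hypothetical helper, not the module's actual code):

```python
def check_revoke_params(url=None, hostname=None, token=None):
    """Mimic the parameter rule behind the error in the log: revoking
    (or obtaining) an SSO token needs an API endpoint, given either as
    a full 'url' or as a 'hostname' to build one from."""
    if url is None and hostname is None:
        raise ValueError("You must specify either 'url' or 'hostname'.")
    return url or f"https://{hostname}/ovirt-engine/api"
```

Note that the nested ovirt_auth fact in the failed invocation still carries the API url, which suggests the revoke task simply was not handed those credentials.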
2021-05-17 09:25:26,338-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6316828> kwargs ignore_errors:True
2021-05-17 09:25:28,130-0400 INFO ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : include_tasks'}
2021-05-17 09:25:28,130-0400 DEBUG ansible on_any args TASK: ovirt.hosted_engine_setup : include_tasks kwargs is_conditional:False
2021-05-17 09:25:28,132-0400 DEBUG ansible on_any args localhostTASK: ovirt.hosted_engine_setup : include_tasks kwargs
2021-05-17 09:25:29,760-0400 INFO ansible ok {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_host': 'localhost', 'ansible_task': '', 'task_duration': 3}
2021-05-17 09:25:29,760-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6324748> kwargs
2021-05-17 09:25:29,897-0400 DEBUG ansible on_any args /usr/share/ansible/roles/ovirt.hosted_engine_setup/tasks/auth_sso.yml (args={} vars={}): [localhost] kwargs
2021-05-17 09:25:31,738-0400 INFO ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : Obtain SSO token using username/password credentials'}
2021-05-17 09:25:31,739-0400 DEBUG ansible on_any args TASK: ovirt.hosted_engine_setup : Obtain SSO token using username/password credentials kwargs is_conditional:False
2021-05-17 09:25:31,740-0400 DEBUG ansible on_any args localhostTASK: ovirt.hosted_engine_setup : Obtain SSO token using username/password credentials kwargs
2021-05-17 09:25:33,871-0400 DEBUG var changed: host "localhost" var "ovirt_auth" type "<class 'dict'>" value: "{
"ca_file": null,
"compress": true,
"headers": null,
"insecure": true,
"kerberos": false,
"timeout": 0,
"token": "yOwJzQ9dWwHkzvEhLAgKkelJ8m5FqN0mucEj720NUcjqpAawFnjxDkAQnr_j03jJJFcV26LAn82vFvaPwYND5A",
"url": "https://oe.telecom.lan/ovirt-engine/api"
}"
2021-05-17 09:25:33,872-0400 INFO ansible ok {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_host': 'localhost', 'ansible_task': 'Obtain SSO token using username/password credentials', 'task_duration': 3}
2021-05-17 09:25:33,872-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6366d68> kwargs
2021-05-17 09:25:35,930-0400 DEBUG var changed: host "localhost" var "ovirt_sso_auth" type "<class 'dict'>" value: "{
"ansible_facts": {
"ovirt_auth": {
"ca_file": null,
"compress": true,
"headers": null,
"insecure": true,
"kerberos": false,
"timeout": 0,
"token": "yOwJzQ9dWwHkzvEhLAgKkelJ8m5FqN0mucEj720NUcjqpAawFnjxDkAQnr_j03jJJFcV26LAn82vFvaPwYND5A",
"url": "https://oe.telecom.lan/ovirt-engine/api"
}
},
"attempts": 1,
"changed": false,
"failed": false
}"
2021-05-17 09:25:35,930-0400 INFO ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : Wait for the host to be up'}
2021-05-17 09:25:35,930-0400 DEBUG ansible on_any args TASK: ovirt.hosted_engine_setup : Wait for the host to be up kwargs is_conditional:False
2021-05-17 09:25:35,931-0400 DEBUG ansible on_any args localhostTASK: ovirt.hosted_engine_setup : Wait for the host to be up kwargs
2021-05-17 09:25:36,964-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e638c5f8> kwargs
2021-05-17 09:25:47,793-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6369d30> kwargs
2021-05-17 09:25:58,789-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e653eac8> kwargs
2021-05-17 09:26:09,720-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6783c88> kwargs
2021-05-17 09:26:20,678-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e638ce48> kwargs
2021-05-17 09:26:31,612-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6321940> kwargs
2021-05-17 09:26:42,482-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6334ac8> kwargs
2021-05-17 09:26:53,280-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6369eb8> kwargs
2021-05-17 09:27:04,223-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e64ee780> kwargs
2021-05-17 09:27:15,093-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e67de0b8> kwargs
2021-05-17 09:27:26,047-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6494c50> kwargs
2021-05-17 09:27:36,817-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6544588> kwargs
2021-05-17 09:27:47,766-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e653ee10> kwargs
2021-05-17 09:27:58,717-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62c1860> kwargs
2021-05-17 09:28:09,670-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e65a5828> kwargs
2021-05-17 09:28:20,540-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e668f9b0> kwargs
2021-05-17 09:28:31,302-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e63f8550> kwargs
2021-05-17 09:28:42,258-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6366e80> kwargs
2021-05-17 09:28:53,196-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e65a5fd0> kwargs
2021-05-17 09:29:04,110-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e676b780> kwargs
2021-05-17 09:29:14,878-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e638cc50> kwargs
2021-05-17 09:29:25,807-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e666a0b8> kwargs
2021-05-17 09:29:36,764-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e64eeb70> kwargs
2021-05-17 09:29:47,738-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e638c390> kwargs
2021-05-17 09:29:58,122-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6540b38> kwargs
2021-05-17 09:30:08,837-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e63fac88> kwargs
2021-05-17 09:30:19,696-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62f6860> kwargs
2021-05-17 09:30:30,365-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e651b1d0> kwargs
2021-05-17 09:30:41,301-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6307b00> kwargs
2021-05-17 09:30:52,053-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e63544e0> kwargs
2021-05-17 09:31:02,807-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62f4080> kwargs
2021-05-17 09:31:13,759-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e63897b8> kwargs
2021-05-17 09:31:24,734-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62e24a8> kwargs
2021-05-17 09:31:35,707-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e6389208> kwargs
2021-05-17 09:31:46,639-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62a3b38> kwargs
2021-05-17 09:31:57,053-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62c76d8> kwargs
2021-05-17 09:32:07,811-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62e07f0> kwargs
2021-05-17 09:32:18,753-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e63020f0> kwargs
2021-05-17 09:32:29,602-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62c7c18> kwargs
2021-05-17 09:32:40,355-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62b6518> kwargs
2021-05-17 09:32:51,054-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62db3c8> kwargs
2021-05-17 09:33:01,824-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62c75c0> kwargs
2021-05-17 09:33:12,773-0400 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fb2e62a3518> kwargs
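The trailing DEBUG lines are the "Wait for the host to be up" task retrying roughly every 11 seconds until the host reports up or the attempts run out. A sketch of that polling pattern (names and timings are illustrative, not taken from the role):

```python
import time

def wait_for_host_up(check_status, retries=120, delay=10):
    """Poll check_status() until it returns 'up' or retries run out,
    mirroring the ~10-11 s cadence visible in the DEBUG timestamps."""
    for attempt in range(1, retries + 1):
        if check_status() == "up":
            return attempt
        time.sleep(delay)
    raise TimeoutError("Host never came up")

# Simulated status source: the host becomes 'up' on the third poll.
states = iter(["installing", "non_responsive", "up"])
attempts = wait_for_host_up(lambda: next(states), retries=5, delay=0)
```

If the log ends with this loop still running, the host side (vdsm, networking, engine reachability) is the place to look next.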
Dominique D
2021-05-19 18:14:56 UTC
I don't know where to look to find the solution :(.

Here are my logs:

https://drive.google.com/drive/folders/12dDBqoiFFqWnEG_JJGm4fT2RG8cbEHtF?usp=sharing
Dominique D
2021-05-17 17:42:53 UTC
I'm not trying to update the engine; I just want to install it.

It's a simple lab: I use one interface and one subnet for the node and the engine, and a different subnet for the DNS server.

Here is the output of systemctl status NetworkManager:

NetworkManager.service - Network Manager
   Loaded: loaded (/usr/lib/systemd/system/NetworkManager.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 09:33:54 EDT; 4h 0min ago
     Docs: man:NetworkManager(8)
 Main PID: 1415 (NetworkManager)
    Tasks: 3 (limit: 204123)
   Memory: 10.1M
   CGroup: /system.slice/NetworkManager.service
           └─1415 /usr/sbin/NetworkManager --no-daemon

May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.7693] manager: (bondscan-rg1vK3): new Bond device (/org/freedesktop/NetworkManager/Devices/12)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.8012] manager: (bondscan-rg1vK3): new Bond device (/org/freedesktop/NetworkManager/Devices/13)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.8254] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/14)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.8641] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/15)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.9078] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/16)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.9455] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/17)
May 17 09:33:59 ov01.telecom.lan NetworkManager[1415]: <info> [1621258439.9974] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/18)
May 17 09:34:00 ov01.telecom.lan NetworkManager[1415]: <info> [1621258440.0261] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/19)
May 17 09:34:00 ov01.telecom.lan NetworkManager[1415]: <info> [1621258440.0702] manager: (bondscan-QjUsFZ): new Bond device (/org/freedesktop/NetworkManager/Devices/20)
May 17 09:34:32 ov01.telecom.lan NetworkManager[1415]: <info> [1621258472.2080] manager: (;vdsmdummy;): new Bridge device (/org/freedesktop/NetworkManager/Devices/21