ansible-playbook [core 2.17.13]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-TBt
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Jun 4 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-7)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml

PLAY [Ensure mandatory variables are defined] **********************************

TASK [Set up test environment] *************************************************
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:10
Wednesday 20 August 2025  13:18:35 -0400 (0:00:00.039)  0:00:00.039 ******
included: fedora.linux_system_roles.ha_cluster for managed-node1

TASK [fedora.linux_system_roles.ha_cluster : Set node name to 'localhost' for single-node clusters] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:9
Wednesday 20 August 2025  13:18:35 -0400 (0:00:00.049)  0:00:00.089 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "inventory_hostname": "localhost"
    },
    "changed": false
}

TASK [fedora.linux_system_roles.ha_cluster : Ensure facts used by tests] *******
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:14
Wednesday 20 August 2025  13:18:35 -0400 (0:00:00.141)  0:00:00.231 ******
[WARNING]: Platform linux on host localhost is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [fedora.linux_system_roles.ha_cluster : Check if system is ostree] ********
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:22
Wednesday 20 August 2025  13:18:36 -0400 (0:00:01.290)  0:00:01.521 ******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.ha_cluster : Set flag to indicate system is ostree] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:27
Wednesday 20 August 2025  13:18:37 -0400 (0:00:00.790)  0:00:02.312 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__ha_cluster_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.ha_cluster : Do not try to enable RHEL repositories] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:32
Wednesday 20 August 2025  13:18:37 -0400 (0:00:00.091)  0:00:02.404 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution == 'RedHat'",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ha_cluster : Copy nss-altfiles ha_cluster users to /etc/passwd] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:41
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.217)  0:00:02.621 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__ha_cluster_is_ostree | d(false)",
    "skip_reason": "Conditional result was False"
}

TASK [Run the role] ************************************************************
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:15
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.144)  0:00:02.766 ******
included: fedora.linux_system_roles.ha_cluster for managed-node1

TASK [fedora.linux_system_roles.ha_cluster : Set platform/version specific variables] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:3
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.171)  0:00:02.937 ******
included: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.ha_cluster : Ensure ansible_facts used by role] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:2
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.105)  0:00:03.042 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__ha_cluster_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ha_cluster : Check if system is ostree] ********
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:10
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.198)  0:00:03.241 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __ha_cluster_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ha_cluster : Set flag to indicate system is ostree] ***
task path:
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:15
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.077)  0:00:03.319 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __ha_cluster_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ha_cluster : Set platform/version specific variables] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:19
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.084)  0:00:03.403 ******
ok: [managed-node1] => (item=RedHat.yml) => {
    "ansible_facts": {
        "__ha_cluster_cloud_agents_packages": {},
        "__ha_cluster_fence_agent_packages_default": "{{ ['fence-agents-all'] + (['fence-virt'] if ansible_architecture == 'x86_64' else []) }}",
        "__ha_cluster_fullstack_node_packages": ["corosync", "libknet1-plugins-all", "resource-agents", "pacemaker"],
        "__ha_cluster_pcs_provider": "pcs-0.10",
        "__ha_cluster_qdevice_node_packages": ["corosync-qdevice", "bash", "coreutils", "curl", "grep", "nss-tools", "openssl", "sed"],
        "__ha_cluster_repos": [],
        "__ha_cluster_role_essential_packages": ["pcs", "corosync-qnetd", "openssl"],
        "__ha_cluster_sbd_packages": ["sbd"],
        "__ha_cluster_services": ["corosync", "corosync-qdevice", "pacemaker"]
    },
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/RedHat.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__ha_cluster_cloud_agents_packages": {
            "aarch64": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "noarch": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc"],
            "ppc64le": ["fence-agents-compute", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"],
            "s390x": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "x86_64": ["resource-agents-cloud", "fence-agents-aliyun", "fence-agents-aws", "fence-agents-azure-arm", "fence-agents-compute", "fence-agents-gce", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"]
        },
        "__ha_cluster_repos": [
            {"id": "highavailability", "name": "HighAvailability"},
            {"id": "resilientstorage", "name": "ResilientStorage"}
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__ha_cluster_cloud_agents_packages": {
            "aarch64": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "noarch": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc"],
            "ppc64le": ["fence-agents-compute", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"],
            "s390x": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "x86_64": ["resource-agents-cloud", "fence-agents-aliyun", "fence-agents-aws", "fence-agents-azure-arm", "fence-agents-compute", "fence-agents-gce", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"]
        },
        "__ha_cluster_repos": [
            {"id": "highavailability", "name": "HighAvailability"},
            {"id": "resilientstorage", "name": "ResilientStorage"}
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__ha_cluster_cloud_agents_packages": {
            "aarch64": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "noarch": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc"],
            "ppc64le": ["fence-agents-compute", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"],
            "s390x": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "x86_64": ["resource-agents-cloud", "fence-agents-aliyun", "fence-agents-aws", "fence-agents-azure-arm", "fence-agents-compute", "fence-agents-gce", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"]
        },
        "__ha_cluster_repos": [
            {"id": "highavailability", "name": "HighAvailability"},
            {"id": "resilientstorage", "name": "ResilientStorage"}
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__ha_cluster_cloud_agents_packages": {
            "aarch64": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "noarch": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc"],
            "ppc64le": ["fence-agents-compute", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"],
            "s390x": ["fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt"],
            "x86_64": ["resource-agents-cloud", "fence-agents-aliyun", "fence-agents-aws", "fence-agents-azure-arm", "fence-agents-compute", "fence-agents-gce", "fence-agents-ibm-powervs", "fence-agents-ibm-vpc", "fence-agents-kubevirt", "fence-agents-openstack"]
        },
        "__ha_cluster_repos": [
            {"id": "highavailability", "name": "HighAvailability"},
            {"id": "resilientstorage", "name": "ResilientStorage"}
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}

TASK [fedora.linux_system_roles.ha_cluster : Set Linux Pacemaker shell specific variables] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:42
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.167)  0:00:03.570 ******
ok: [managed-node1] => {
    "ansible_facts": {},
    "ansible_included_var_files": [
        "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/vars/shell_pcs.yml"
    ],
    "changed": false
}

TASK [fedora.linux_system_roles.ha_cluster : Enable package repositories] ******
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:6
Wednesday 20 August 2025  13:18:38 -0400 (0:00:00.044)  0:00:03.615 ******
included: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-package-repositories.yml for managed-node1

TASK [fedora.linux_system_roles.ha_cluster : Find platform/version specific tasks to enable repositories] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-package-repositories.yml:3
Wednesday 20 August 2025  13:18:39 -0400 (0:00:00.073)  0:00:03.688 ******
ok: [managed-node1] => (item=RedHat.yml) => {
    "ansible_facts": {
        "__ha_cluster_enable_repo_tasks_file": "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/RedHat.yml"
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml"
}
ok: [managed-node1] => (item=CentOS.yml) => {
    "ansible_facts": {
        "__ha_cluster_enable_repo_tasks_file": "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/CentOS.yml"
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS.yml"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__ha_cluster_enable_repo_tasks_file_candidate is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__ha_cluster_enable_repo_tasks_file_candidate is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ha_cluster : Run platform/version specific tasks to enable repositories] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-package-repositories.yml:21
Wednesday 20 August 2025  13:18:39 -0400 (0:00:00.114)  0:00:03.803 ******
included: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/CentOS.yml for managed-node1

TASK [fedora.linux_system_roles.ha_cluster : List active CentOS repositories] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/CentOS.yml:3
Wednesday 20 August 2025  13:18:39 -0400 (0:00:00.098)  0:00:03.902 ******
ok: [managed-node1] => {
    "changed": false,
    "cmd": ["dnf", "repolist"],
    "delta": "0:00:00.197759",
    "end": "2025-08-20 13:18:40.304023",
    "rc": 0,
    "start": "2025-08-20 13:18:40.106264"
}

STDOUT:

repo id                                   repo name
appstream                                 CentOS Stream 9 - AppStream
baseos                                    CentOS Stream 9 - BaseOS
beaker-client                             Beaker Client - RedHatEnterpriseLinux9
beaker-harness                            Beaker harness
beakerlib-libraries                       Copr repo for beakerlib-libraries owned by bgoncalv
copr:copr.devel.redhat.com:lpol:qa-tools  Copr repo for qa-tools owned by lpol
epel                                      Extra Packages for Enterprise Linux 9 - x86_64
epel-cisco-openh264                       Extra Packages for Enterprise Linux 9 openh264 (From Cisco) - x86_64
epel-debuginfo                            Extra Packages for Enterprise Linux 9 - x86_64 - Debug
epel-source                               Extra Packages for Enterprise Linux 9 - x86_64 - Source
extras-common                             CentOS Stream 9 - Extras packages
highavailability                          CentOS Stream 9 - HighAvailability

TASK [fedora.linux_system_roles.ha_cluster : Enable CentOS repositories] *******
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/CentOS.yml:10
Wednesday 20 August 2025  13:18:40 -0400 (0:00:01.069)  0:00:04.971 ******
skipping: [managed-node1] => (item={'id': 'highavailability', 'name': 'HighAvailability'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item.id not in __ha_cluster_repolist.stdout",
    "item": {
        "id": "highavailability",
        "name": "HighAvailability"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item={'id': 'resilientstorage', 'name': 'ResilientStorage'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item.name != \"ResilientStorage\" or ha_cluster_enable_repos_resilient_storage",
    "item": {
        "id": "resilientstorage",
        "name": "ResilientStorage"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}
MSG: All items skipped

TASK [fedora.linux_system_roles.ha_cluster : Install role essential packages] ***
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:11
Wednesday 20 August 2025  13:18:40 -0400 (0:00:00.032)  0:00:05.004 ******
fatal: [managed-node1]: FAILED!
=> {
    "changed": false,
    "rc": 1,
    "results": []
}
MSG: Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

TASK [Extract errors] **********************************************************
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:19
Wednesday 20 August 2025  13:18:45 -0400 (0:00:04.744)  0:00:09.749 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "error_list": []
    },
    "changed": false
}

TASK [Check errors] ************************************************************
task path: /tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:24
Wednesday 20 August 2025  13:18:45 -0400 (0:00:00.118)  0:00:09.868 ******
fatal: [managed-node1]: FAILED! => {
    "assertion": "'ha_cluster_hacluster_password must be specified' in error_list",
    "changed": false,
    "evaluated_to": false
}
MSG: Assertion failed

PLAY RECAP *********************************************************************
managed-node1              : ok=14   changed=0    unreachable=0    failed=1    skipped=6    rescued=1    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.17.13",
        "end_time": "2025-08-20T17:18:45.122985+00:00Z",
        "host": "managed-node1",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-08-20T17:18:40.389184+00:00Z",
        "task_name": "Install role essential packages",
        "task_path": "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:11"
    },
    {
        "ansible_version": "2.17.13",
        "end_time": "2025-08-20T17:18:45.303514+00:00Z",
        "host": "managed-node1",
        "message": "Assertion failed",
        "start_time": "2025-08-20T17:18:45.261661+00:00Z",
        "task_name": "Check errors",
        "task_path":
            "/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:24"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Wednesday 20 August 2025  13:18:45 -0400 (0:00:00.052)  0:00:09.921 ******
===============================================================================
fedora.linux_system_roles.ha_cluster : Install role essential packages --- 4.75s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:11
fedora.linux_system_roles.ha_cluster : Ensure facts used by tests ------- 1.29s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:14
fedora.linux_system_roles.ha_cluster : List active CentOS repositories --- 1.07s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-repositories/CentOS.yml:3
fedora.linux_system_roles.ha_cluster : Check if system is ostree -------- 0.79s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:22
fedora.linux_system_roles.ha_cluster : Do not try to enable RHEL repositories --- 0.22s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:32
fedora.linux_system_roles.ha_cluster : Ensure ansible_facts used by role --- 0.20s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:2
Run the role ------------------------------------------------------------ 0.17s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:15
fedora.linux_system_roles.ha_cluster : Set platform/version specific variables --- 0.17s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:19
fedora.linux_system_roles.ha_cluster : Copy nss-altfiles ha_cluster users to /etc/passwd --- 0.14s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:41
fedora.linux_system_roles.ha_cluster : Set node name to 'localhost' for single-node clusters --- 0.14s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:9
Extract errors ---------------------------------------------------------- 0.12s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:19
fedora.linux_system_roles.ha_cluster : Find platform/version specific tasks to enable repositories --- 0.12s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-package-repositories.yml:3
fedora.linux_system_roles.ha_cluster : Set platform/version specific variables --- 0.11s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:3
fedora.linux_system_roles.ha_cluster : Run platform/version specific tasks to enable repositories --- 0.10s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/enable-package-repositories.yml:21
fedora.linux_system_roles.ha_cluster : Set flag to indicate system is ostree --- 0.09s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/test_setup.yml:27
fedora.linux_system_roles.ha_cluster : Set flag to indicate system is ostree --- 0.08s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:15
fedora.linux_system_roles.ha_cluster : Check if system is ostree -------- 0.08s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/set_vars.yml:10
fedora.linux_system_roles.ha_cluster : Enable package repositories ------ 0.07s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/roles/ha_cluster/tasks/main.yml:6
Check errors ------------------------------------------------------------ 0.05s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:24
Set up test environment ------------------------------------------------- 0.05s
/tmp/collections-TBt/ansible_collections/fedora/linux_system_roles/tests/ha_cluster/tests_default.yml:10

Aug 20 13:18:36 managed-node1 python3.9[11539]: ansible-setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Aug 20 13:18:37 managed-node1 python3.9[11692]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Aug 20 13:18:40 managed-node1 python3.9[11841]: ansible-ansible.legacy.command Invoked with _raw_params=dnf repolist _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Aug 20 13:18:41 managed-node1 python3.9[11991]: ansible-ansible.legacy.dnf Invoked with name=['pcs', 'corosync-qnetd', 'openssl'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Aug 20 13:18:45 managed-node1 sshd[12052]: Accepted publickey for root from 10.31.47.21 port 47226 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Aug 20 13:18:45 managed-node1 systemd-logind[602]: New session 18 of user root.
░░ Subject: A new session 18 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 18 has been created for the user root.
░░
░░ The leading process of the session is 12052.
Aug 20 13:18:45 managed-node1 systemd[1]: Started Session 18 of User root.
░░ Subject: A start job for unit session-18.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-18.scope has finished successfully.
░░
░░ The job identifier is 1798.
Aug 20 13:18:45 managed-node1 sshd[12052]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Aug 20 13:18:45 managed-node1 sshd[12055]: Received disconnect from 10.31.47.21 port 47226:11: disconnected by user
Aug 20 13:18:45 managed-node1 sshd[12055]: Disconnected from user root 10.31.47.21 port 47226
Aug 20 13:18:45 managed-node1 sshd[12052]: pam_unix(sshd:session): session closed for user root
Aug 20 13:18:45 managed-node1 systemd[1]: session-18.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-18.scope has successfully entered the 'dead' state.
Aug 20 13:18:45 managed-node1 systemd-logind[602]: Session 18 logged out. Waiting for processes to exit.
Aug 20 13:18:45 managed-node1 systemd-logind[602]: Removed session 18.
░░ Subject: Session 18 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 18 has been terminated.
Aug 20 13:18:45 managed-node1 sshd[12080]: Accepted publickey for root from 10.31.47.21 port 47234 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Aug 20 13:18:45 managed-node1 systemd-logind[602]: New session 19 of user root.
░░ Subject: A new session 19 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 19 has been created for the user root.
░░
░░ The leading process of the session is 12080.
Aug 20 13:18:45 managed-node1 systemd[1]: Started Session 19 of User root.
░░ Subject: A start job for unit session-19.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-19.scope has finished successfully.
░░
░░ The job identifier is 1867.
Aug 20 13:18:45 managed-node1 sshd[12080]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
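For context on the failed assertion ("'ha_cluster_hacluster_password must be specified' in error_list"): the ha_cluster role requires that variable whenever it actually configures a cluster. A minimal invocation sketch, not taken from this run; the cluster name and vaulted variable are placeholders:

```yaml
# Hypothetical playbook fragment; ha_cluster_hacluster_password is mandatory
# for the role, and keeping the value in Ansible Vault is the usual practice.
- hosts: all
  vars:
    ha_cluster_cluster_name: test-cluster
    ha_cluster_hacluster_password: "{{ vault_hacluster_password }}"
  roles:
    - fedora.linux_system_roles.ha_cluster
```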
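The log above embeds a machine-readable error summary between the `SYSTEM ROLES ERRORS BEGIN v1` and `SYSTEM ROLES ERRORS END v1` markers. A minimal sketch of pulling that block out of a captured log for triage; the marker strings come from the log itself, while the function name and the shortened sample text are illustrative:

```python
import json
import re

def extract_system_roles_errors(log_text):
    """Return the JSON error records between the v1 markers, or [] if absent."""
    match = re.search(
        r"SYSTEM ROLES ERRORS BEGIN v1(.*?)SYSTEM ROLES ERRORS END v1",
        log_text,
        re.DOTALL,
    )
    if not match:
        return []
    return json.loads(match.group(1))

# Shortened stand-in for the real log; only the marker layout matters here.
sample = """
SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "host": "managed-node1",
        "task_name": "Install role essential packages",
        "message": "Failed to download metadata for repo 'highavailability'"
    }
]
SYSTEM ROLES ERRORS END v1
"""
errors = extract_system_roles_errors(sample)
print(errors[0]["task_name"])  # -> Install role essential packages
```

The non-greedy match plus `re.DOTALL` keeps the search from overrunning the `END` marker even when the JSON spans many lines.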