ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-QnR
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Saturday 04 October 2025 18:48:43 -0400 (0:00:00.319)       0:00:00.319 ******
[WARNING]: Platform linux on host managed-node8 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node8]
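The interpreter-discovery warning above is advisory: facts were still gathered with /usr/bin/python3.9. One way to make the choice explicit, so that a later Python installation cannot change what the discovered path means, is to pin ansible_python_interpreter for the host. A minimal sketch, assuming a YAML inventory (only the hostname and interpreter path are taken from this log):

    all:
      hosts:
        managed-node8:
          # Pin the interpreter discovery found, so the warning goes away
          # and the choice survives future Python installs on the host.
          ansible_python_interpreter: /usr/bin/python3.9
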
TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Saturday 04 October 2025 18:48:48 -0400 (0:00:04.747)       0:00:05.066 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot] ******************************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Saturday 04 October 2025 18:48:48 -0400 (0:00:00.272)       0:00:05.339 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Saturday 04 October 2025 18:48:48 -0400 (0:00:00.194)       0:00:05.534 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Saturday 04 October 2025 18:48:48 -0400 (0:00:00.170)       0:00:05.705 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Saturday 04 October 2025 18:48:48 -0400 (0:00:00.177)       0:00:05.883 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Saturday 04 October 2025 18:48:48 -0400 (0:00:00.127)       0:00:06.011 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Saturday 04 October 2025 18:48:49 -0400 (0:00:00.200)       0:00:06.211 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Run the role] ************************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Saturday 04 October 2025 18:48:49 -0400 (0:00:00.186)       0:00:06.398 ******
included: fedora.linux_system_roles.storage for managed-node8
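All seven FIPS-related tasks above skip for the same reason: the SYSTEM_ROLES_TEST_FIPS environment variable is not set to "true" on the controller. Judging from the reported false_condition, each task is gated roughly as in the sketch below; the module and arguments are placeholders, since the log shows only the condition, not the task bodies in tests_luks2.yml:

    - name: Enable FIPS mode
      # Hypothetical body; only the `when` clause is visible in the log.
      ansible.builtin.command: fips-mode-setup --enable
      when: lookup("env", "SYSTEM_ROLES_TEST_FIPS") == "true"
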
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 04 October 2025 18:48:49 -0400 (0:00:00.449)       0:00:06.847 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 04 October 2025 18:48:50 -0400 (0:00:00.295)       0:00:07.143 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 04 October 2025 18:48:50 -0400 (0:00:00.465)       0:00:07.608 ******
skipping: [managed-node8] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node8] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
ok: [managed-node8] => (item=CentOS_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml"}
ok: [managed-node8] => (item=CentOS_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml"}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 04 October 2025 18:48:51 -0400 (0:00:00.501)       0:00:08.110 ******
ok: [managed-node8] => {"changed": false, "stat": {"exists": false}}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 04 October 2025 18:48:53 -0400 (0:00:02.382)       0:00:10.492 ******
ok: [managed-node8] => {"ansible_facts": {"__storage_is_ostree": false}, "changed": false}
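The vars-file loop above is the usual linux_system_roles pattern: try progressively more specific files and load whichever exist. CentOS_9.yml is loaded twice because, on CentOS Stream 9, the major version and the full version both render to the string 9, so two loop items resolve to the same file. A sketch of the pattern; the exact loop in set_vars.yml may differ in detail:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      loop:
        - RedHat.yml                                                   # OS family
        - "{{ ansible_distribution }}.yml"                             # CentOS.yml
        - "{{ ansible_distribution }}_{{ ansible_distribution_major_version }}.yml"  # CentOS_9.yml
        - "{{ ansible_distribution }}_{{ ansible_distribution_version }}.yml"        # CentOS_9.yml again
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file
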
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 04 October 2025 18:48:53 -0400 (0:00:00.339)       0:00:10.832 ******
ok: [managed-node8] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 04 October 2025 18:48:53 -0400 (0:00:00.189)       0:00:11.021 ******
ok: [managed-node8] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 04 October 2025 18:48:54 -0400 (0:00:00.193)       0:00:11.214 ******
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 04 October 2025 18:48:54 -0400 (0:00:00.669)       0:00:11.883 ******
ok: [managed-node8] => {"changed": false, "rc": 0, "results": []}

MSG:
Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 04 October 2025 18:48:58 -0400 (0:00:03.753)       0:00:15.637 ******
ok: [managed-node8] => {"storage_pools | d([])": []}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 04 October 2025 18:48:59 -0400 (0:00:00.463)       0:00:16.101 ******
ok: [managed-node8] => {"storage_volumes | d([])": []}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 04 October 2025 18:48:59 -0400 (0:00:00.519)       0:00:16.620 ******
ok: [managed-node8] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Saturday 04 October 2025 18:49:03 -0400 (0:00:03.700)       0:00:20.323 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 04 October 2025 18:49:03 -0400 (0:00:00.439)       0:00:20.762 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 04 October 2025 18:49:04 -0400 (0:00:00.386)       0:00:21.148 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 04 October 2025 18:49:04 -0400 (0:00:00.443)       0:00:21.592 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Saturday 04 October 2025 18:49:05 -0400 (0:00:00.468)       0:00:22.061 ******
ok: [managed-node8] => {"changed": false, "rc": 0, "results": []}

MSG:
Nothing to do
lsrpackages: kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Saturday 04 October 2025 18:49:07 -0400 (0:00:02.187)       0:00:24.249 ******
ok: [managed-node8] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
            "apt-daily.service": {"name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
            "autofs.service": {"name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias"},
            "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
            "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "chronyd-restricted.service": {"name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-final.service": {"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "console-getty.service": {"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "container-getty@.service": {"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "cpupower.service": {"name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "crond.service": {"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled"},
            "dbus-broker.service": {"name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled"},
            "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias"},
            "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus.service": {"name": "dbus.service", "source": "systemd", "state": "active", "status": "alias"},
            "debug-shell.service": {"name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "display-manager.service": {"name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "dm-event.service": {"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dnf-makecache.service": {"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
            "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "dracut-cmdline.service": {"name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-initqueue.service": {"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-mount.service": {"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown.service": {"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static"},
            "emergency.service": {"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static"},
            "fcoe.service": {"name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "firewalld.service": {"name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "fstrim.service": {"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static"},
            "getty@.service": {"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
            "getty@tty1.service": {"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active"},
            "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static"},
            "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static"},
            "gssproxy.service": {"name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled"},
            "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "initrd-cleanup.service": {"name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-switch-root.service": {"name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static"},
            "irqbalance.service": {"name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled"},
            "iscsi-shutdown.service": {"name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "iscsi.service": {"name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "iscsid.service": {"name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "kdump.service": {"name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static"},
            "kvm_stat.service": {"name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "ldconfig.service": {"name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static"},
            "logrotate.service": {"name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static"},
            "lvm-devices-import.service": {"name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "lvm2-activation-early.service": {"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static"},
            "lvm2-monitor.service": {"name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "man-db-cache-update.service": {"name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static"},
            "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdcheck_continue.service": {"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdcheck_start.service": {"name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdmon@.service": {"name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdmonitor.service": {"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "microcode.service": {"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "modprobe@.service": {"name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "modprobe@configfs.service": {"name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "modprobe@drm.service": {"name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "modprobe@fuse.service": {"name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "multipathd.service": {"name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "ndctl-monitor.service": {"name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "network.service": {"name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "nfs-blkmap.service": {"name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "nfs-idmapd.service": {"name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfs-mountd.service": {"name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfs-server.service": {"name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "nfs-utils.service": {"name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfsdcld.service": {"name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nftables.service": {"name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "nis-domainname.service": {"name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "nm-priv-helper.service": {"name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static"},
            "ntpd.service": {"name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "ntpdate.service": {"name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "oddjobd.service": {"name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "pam_namespace.service": {"name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "plymouth-start.service": {"name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "quotaon.service": {"name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static"},
            "raid-check.service": {"name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rbdmap.service": {"name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "rc-local.service": {"name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rdisc.service": {"name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "rescue.service": {"name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static"},
            "restraintd.service": {"name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rngd.service": {"name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rpc-gssd.service": {"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rpc-statd.service": {"name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "rpcbind.service": {"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "rsyslog.service": {"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled"},
            "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static"},
            "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "serial-getty@.service": {"name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect"},
            "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active"},
            "sntp.service": {"name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "sshd-keygen.service": {"name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "sshd-keygen@.service": {"name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled"},
            "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd.service": {"name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "sshd@.service": {"name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "sssd-autofs.service": {"name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-kcm.service": {"name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect"},
            "sssd-nss.service": {"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-pac.service": {"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-pam.service": {"name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-ssh.service": {"name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-sudo.service": {"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd.service": {"name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "stratis-fstab-setup@.service": {"name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "stratisd-min-postinitrd.service": {"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "stratisd.service": {"name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "syslog.service": {"name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "system-update-cleanup.service": {"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-backlight@.service": {"name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-binfmt.service": {"name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-boot-update.service": {"name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "systemd-coredump@.service": {"name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-exit.service": {"name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-firstboot.service": {"name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-fsck@.service": {"name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-growfs@.service": {"name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-halt.service": {"name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-hibernate.service": {"name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-initctl.service": {"name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-journald.service": {"name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static"},
            "systemd-journald@.service": {"name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-kexec.service": {"name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-localed.service": {"name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-logind.service": {"name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static"},
            "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-modules-load.service": {"name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-network-generator.service": {"name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-poweroff.service": {"name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-pstore.service": {"name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-random-seed.service": {"name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-reboot.service": {"name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime"},
            "systemd-repart.service": {"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-rfkill.service": {"name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-suspend.service": {"name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-sysctl.service": {"name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-sysext.service": {"name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "systemd-sysusers.service": {"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-timedated.service": {"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-udevd.service": {"name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static"},
            "systemd-update-done.service": {"name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static"},
            "target.service": {"name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "targetclid.service": {"name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "teamd@.service": {"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active"},
            "user@.service": {"name": "user@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "user@0.service": {"name": "user@0.service", "source": "systemd", "state": "running", "status": "active"},
            "ypbind.service": {"name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "yppasswdd.service": {"name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "ypserv.service": {"name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "ypxfrd.service": {"name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found"}
        }
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Saturday 04 October 2025 18:49:12 -0400 (0:00:04.823)       0:00:29.073 ******
ok: [managed-node8] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Saturday 04 October 2025 18:49:12 -0400 (0:00:00.879)       0:00:29.952 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Saturday 04 October 2025 18:49:13 -0400 (0:00:00.260)       0:00:30.213 ******
ok: [managed-node8] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85
Saturday 04 October 2025 18:49:14 -0400 (0:00:01.754)       0:00:31.968 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False"}
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Saturday 04 October 2025 18:49:15 -0400 (0:00:00.432)       0:00:32.400 ******
ok: [managed-node8] => {
    "changed": false,
    "stat": {
        "atime": 1759618080.8832853, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8,
        "charset": "us-ascii", "checksum": "b4630a077d309650d3ee716aa5a9332baf46f9fe",
        "ctime": 1759618079.5542736, "dev": 51713, "device_type": 0, "executable": false, "exists": true,
        "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false,
        "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false,
        "mimetype": "text/plain", "mode": "0644", "mtime": 1759618079.5542736, "nlink": 1,
        "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true,
        "rusr": true, "size": 1366, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false,
        "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97
Saturday 04 October 2025 18:49:16 -0400 (0:00:01.256)       0:00:33.656 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Saturday 04 October 2025 18:49:16 -0400 (0:00:00.326)       0:00:33.983 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Saturday 04 October 2025 18:49:17 -0400 (0:00:00.269)       0:00:34.252 ******
ok: [managed-node8] => {"blivet_output": {"actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}}

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Saturday 04 October 2025 18:49:17 -0400 (0:00:00.324)       0:00:34.576 ******
ok: [managed-node8] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Saturday 04 October 2025 18:49:17 -0400 (0:00:00.340)       0:00:34.917 ******
ok: [managed-node8] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Saturday 04 October 2025 18:49:18 -0400 (0:00:00.259)       0:00:35.177 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Saturday 04 October 2025 18:49:18 -0400 (0:00:00.362)       0:00:35.540 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Saturday 04 October 2025 18:49:19 -0400 (0:00:00.540)       0:00:36.080 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Saturday 04 October 2025 18:49:19 -0400 (0:00:00.542)       0:00:36.622 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Saturday 04 October 2025 18:49:20 -0400 (0:00:00.560)       0:00:37.183 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Saturday 04 October 2025 18:49:20 -0400 (0:00:00.671)       0:00:37.854 ******
ok: [managed-node8] => {
    "changed": false,
    "stat": {
        "atime": 1759616890.218089, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0,
        "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1759240078.0717998, "dev": 51713, "device_type": 0, "executable": false, "exists": true,
        "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false,
        "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false,
        "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1759239790.5217998, "nlink": 1,
        "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false,
        "rusr": true, "size": 0, "uid": 0, "version": "2367122887", "wgrp": false, "woth": false,
        "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Saturday 04 October 2025 18:49:21 -0400 (0:00:01.165)       0:00:39.020 ******
skipping: [managed-node8] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Saturday 04 October 2025 18:49:22 -0400 (0:00:00.216)       0:00:39.236 ******
ok: [managed-node8]
TASK [Get unused disks] ********************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:76
Saturday 04 October 2025 18:49:24 -0400 (0:00:02.114)       0:00:41.350 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node8

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Saturday 04 October 2025 18:49:24 -0400 (0:00:00.599)       0:00:41.950 ******
ok: [managed-node8] => {"changed": false, "rc": 0, "results": []}

MSG:
Nothing to do
lsrpackages: util-linux-core

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Saturday 04 October 2025 18:49:27 -0400 (0:00:02.208)       0:00:44.158 ******
ok: [managed-node8] => {
    "changed": false,
    "disks": ["sda"],
    "info": [
        "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
        "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
        "filename [xvda1] is a partition",
        "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions"
    ]
}

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Saturday 04 October 2025 18:49:30 -0400 (0:00:03.702)       0:00:47.860 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False"}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Saturday 04 October 2025 18:49:31 -0400 (0:00:00.223)       0:00:48.084 ******
ok: [managed-node8] => {"ansible_facts": {"unused_disks": ["sda"]}, "changed": false}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Saturday 04 October 2025 18:49:31 -0400 (0:00:00.265)       0:00:48.349 ******
skipping: [managed-node8] => {"changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False"}
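The info lines above are the raw material for the disk scan: the get_unused_disk helper walks lsblk output, keeps TYPE="disk" entries with no FSTYPE, and rejects disks that hold partitions (as /dev/xvda does, leaving sda through sdi as candidates and sda as the pick). The field names suggest an lsblk invocation along these lines; this is an assumption, since the helper's exact command is not shown in the log:

    - name: Reproduce the scan by hand (hypothetical task)
      ansible.builtin.command:
        # -p full device paths, --pairs KEY="value" output (lsblk rewrites
        # LOG-SEC to LOG_SEC there), --bytes raw sizes, -o column selection.
        cmd: lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC
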
TASK [Print unused disks] ******************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Saturday 04 October 2025 18:49:31 -0400 (0:00:00.567)       0:00:48.917 ******
ok: [managed-node8] => {"unused_disks": ["sda"]}

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85
Saturday 04 October 2025 18:49:32 -0400 (0:00:00.329)       0:00:49.246 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Saturday 04 October 2025 18:49:32 -0400 (0:00:00.542)       0:00:49.788 ******
ok: [managed-node8] => {"ansible_facts": {"storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": []}, "changed": false}
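With storage_safe_mode_global saved as true, the test now deliberately runs the role in a way that must fail: creating a new encrypted volume without supplying a key. The volume spec it passes presumably looks something like the sketch below; the variable names follow the storage role's documented interface, but the exact values are not visible at this point in the log:

    storage_volumes:
      - name: foo            # hypothetical volume name
        type: disk
        disks: "{{ unused_disks }}"
        fs_type: xfs
        encryption: true
        encryption_luks_version: luks2
        # No encryption_password or encryption_key is given, so the role
        # is expected to refuse to create the volume and raise an error,
        # which verify-role-failed.yml then checks for.
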
{ "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:49:36 -0400 (0:00:00.661) 0:00:53.062 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:49:36 -0400 (0:00:00.271) 0:00:53.334 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:49:36 -0400 (0:00:00.255) 0:00:53.589 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:49:36 -0400 (0:00:00.248) 0:00:53.837 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:49:37 -0400 (0:00:00.302) 0:00:54.140 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:49:37 -0400 (0:00:00.710) 0:00:54.851 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:49:40 -0400 (0:00:02.247) 0:00:57.098 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : 
Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:49:40 -0400 (0:00:00.672) 0:00:57.771 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:49:41 -0400 (0:00:00.688) 0:00:58.459 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:49:44 -0400 (0:00:02.671) 0:01:01.130 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:49:44 -0400 (0:00:00.661) 0:01:01.792 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:49:45 -0400 (0:00:00.554) 0:01:02.347 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:49:45 -0400 (0:00:00.519) 0:01:02.866 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:49:46 -0400 (0:00:00.600) 0:01:03.467 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:49:48 -0400 (0:00:02.229) 0:01:05.697 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": 
"systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": 
"not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:49:51 -0400 (0:00:02.997) 0:01:08.694 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:49:52 -0400 (0:00:00.740) 0:01:09.435 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:49:52 -0400 (0:00:00.250) 0:01:09.685 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 18:49:55 -0400 (0:00:02.386) 0:01:12.072 ****** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 
'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:49:55 -0400 (0:00:00.342) 0:01:12.415 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 18:49:55 -0400 (0:00:00.179) 0:01:12.595 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 18:49:55 -0400 (0:00:00.288) 0:01:12.883 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 18:49:56 -0400 (0:00:00.593) 0:01:13.476 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:101 Saturday 04 October 2025 18:49:56 -0400 (0:00:00.352) 0:01:13.829 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:49:57 -0400 (0:00:00.972) 0:01:14.802 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:49:58 -0400 (0:00:00.383) 0:01:15.185 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:49:58 -0400 (0:00:00.543) 0:01:15.729 ****** 
skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:49:59 -0400 (0:00:00.709) 0:01:16.439 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:49:59 -0400 (0:00:00.296) 0:01:16.735 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:49:59 -0400 (0:00:00.271) 0:01:17.006 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:50:00 -0400 (0:00:00.305) 0:01:17.311 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:50:00 -0400 (0:00:00.272) 0:01:17.584 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to 
ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:50:01 -0400 (0:00:00.897) 0:01:18.481 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:50:03 -0400 (0:00:02.239) 0:01:20.721 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:50:04 -0400 (0:00:00.715) 0:01:21.436 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:50:04 -0400 (0:00:00.600) 0:01:22.037 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:50:07 -0400 (0:00:02.265) 0:01:24.302 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:50:07 -0400 (0:00:00.634) 0:01:24.936 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:50:08 -0400 (0:00:00.635) 0:01:25.572 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:50:09 -0400 (0:00:00.543) 0:01:26.115 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:50:09 -0400 (0:00:00.672) 0:01:26.788 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:50:12 -0400 (0:00:02.373) 0:01:29.162 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { 
"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:50:14 -0400 (0:00:02.864) 0:01:32.027 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:50:15 -0400 (0:00:00.740) 0:01:32.767 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:50:15 -0400 (0:00:00.273) 0:01:33.041 ****** changed: [managed-node8] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": 
true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 18:50:27 -0400 (0:00:11.842) 0:01:44.884 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 18:50:28 -0400 (0:00:00.466) 0:01:45.351 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618080.8832853, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b4630a077d309650d3ee716aa5a9332baf46f9fe", "ctime": 1759618079.5542736, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618079.5542736, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 18:50:29 -0400 (0:00:01.077) 0:01:46.428 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:50:31 -0400 (0:00:02.370) 0:01:48.799 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 18:50:31 -0400 (0:00:00.201) 0:01:49.000 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 18:50:32 -0400 (0:00:00.667) 0:01:49.667 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 18:50:32 -0400 (0:00:00.265) 0:01:49.933 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 18:50:33 -0400 (0:00:00.180) 0:01:50.113 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 18:50:33 -0400 (0:00:00.452) 0:01:50.576 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 18:50:38 -0400 (0:00:04.787) 0:01:55.363 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 18:50:41 -0400 (0:00:03.430) 0:01:58.794 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 18:50:42 
-0400 (0:00:00.572) 0:01:59.366 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 18:50:44 -0400 (0:00:01.964) 0:02:01.330 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759616890.218089, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759240078.0717998, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1759239790.5217998, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2367122887", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 18:50:45 -0400 (0:00:01.162) 0:02:02.493 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 18:50:46 -0400 (0:00:01.430) 0:02:03.923 ****** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:114 Saturday 04 October 2025 18:50:48 -0400 (0:00:02.088) 0:02:06.011 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 18:50:49 -0400 (0:00:00.698) 0:02:06.709 ****** skipping: [managed-node8] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 18:50:50 -0400 (0:00:00.509) 0:02:07.219 ****** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 18:50:50 -0400 (0:00:00.588) 0:02:07.807 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "size": "10G", "type": "crypt", "uuid": "2fd3b72f-5095-47eb-a256-73b85945fd11" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7f04d860-5fdb-4936-bbb5-ef0a46440079" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 18:50:53 -0400 (0:00:02.802) 0:02:10.610 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002814", "end": "2025-10-04 18:50:56.358660", "rc": 0, "start": "2025-10-04 18:50:56.355846" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by 
reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 18:50:56 -0400 (0:00:03.008) 0:02:13.619 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003201", "end": "2025-10-04 18:50:57.495490", "failed_when_result": false, "rc": 0, "start": "2025-10-04 18:50:57.492289" } STDOUT: luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 18:50:57 -0400 (0:00:01.148) 0:02:14.767 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 18:50:58 -0400 (0:00:00.493) 0:02:15.261 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': 
[], '_device': '/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 18:50:59 -0400 (0:00:00.795) 0:02:16.057 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 18:50:59 -0400 (0:00:00.711) 0:02:16.769 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 18:51:01 -0400 (0:00:01.777) 0:02:18.546 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 18:51:01 -0400 (0:00:00.375) 0:02:18.922 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 18:51:02 -0400 (0:00:00.670) 0:02:19.593 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not 
storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 18:51:03 -0400 (0:00:00.750) 0:02:20.344 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 18:51:03 -0400 (0:00:00.343) 0:02:20.687 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 18:51:04 -0400 (0:00:00.634) 0:02:21.322 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 18:51:04 -0400 (0:00:00.693) 0:02:22.015 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 18:51:05 -0400 (0:00:00.599) 0:02:22.614 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 18:51:05 -0400 (0:00:00.256) 0:02:22.871 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 18:51:06 -0400 (0:00:00.236) 0:02:23.107 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 18:51:06 -0400 (0:00:00.234) 0:02:23.341 ****** ok: [managed-node8] => { "ansible_facts": { 
"storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 18:51:06 -0400 (0:00:00.244) 0:02:23.586 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 18:51:07 -0400 (0:00:00.971) 0:02:24.557 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 18:51:08 -0400 (0:00:00.566) 0:02:25.124 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 18:51:08 -0400 (0:00:00.659) 0:02:25.784 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 18:51:09 -0400 (0:00:00.617) 0:02:26.402 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 18:51:09 -0400 (0:00:00.585) 0:02:26.988 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 18:51:10 -0400 (0:00:00.245) 0:02:27.233 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 18:51:10 -0400 (0:00:00.728) 0:02:27.962 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 18:51:11 -0400 (0:00:00.612) 0:02:28.574 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618227.332565, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618227.332565, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 452, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759618227.332565, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 18:51:12 -0400 (0:00:01.297) 0:02:29.872 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 18:51:13 -0400 (0:00:00.423) 0:02:30.295 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 18:51:13 -0400 (0:00:00.248) 0:02:30.543 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 18:51:13 -0400 (0:00:00.271) 0:02:30.815 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 18:51:13 -0400 (0:00:00.223) 0:02:31.039 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 18:51:14 -0400 
(0:00:00.294) 0:02:31.333 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 18:51:14 -0400 (0:00:00.405) 0:02:31.738 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618227.5735672, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618227.5735672, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 990, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759618227.5735672, "nlink": 1, "path": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 18:51:15 -0400 (0:00:01.307) 0:02:33.059 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 18:51:18 -0400 (0:00:02.253) 0:02:35.313 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007084", "end": "2025-10-04 18:51:19.095955", "rc": 0, "start": "2025-10-04 18:51:19.088871" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 7f04d860-5fdb-4936-bbb5-ef0a46440079 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 690733 Threads: 2 Salt: d3 1e 20 fc e7 b3 4d f6 92 49 43 79 cf 16 dc 4d 87 cb a2 b1 eb 9a d6 f3 af 0c 9f c6 62 d6 b2 56 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131598 Salt: 57 b1 90 a6 a7 c7 8e d7 71 df 8e 66 03 da 20 df 11 c1 0c db 70 8d 4a 94 52 a4 21 82 5e 19 97 65 Digest: 00 fa 35 16 e4 05 d2 2e b8 85 c8 f5 6a 4d e9 9c a2 c5 21 f5 b3 2d 13 43 30 c0 16 50 70 83 08 a8 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 18:51:19 -0400 (0:00:01.072) 0:02:36.385 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 18:51:20 -0400 (0:00:00.757) 0:02:37.142 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 18:51:20 -0400 (0:00:00.552) 0:02:37.695 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 18:51:21 -0400 (0:00:00.423) 0:02:38.119 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 18:51:21 -0400 (0:00:00.812) 0:02:38.931 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 18:51:22 -0400 (0:00:00.832) 0:02:39.763 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 18:51:22 -0400 (0:00:00.272) 0:02:40.036 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 18:51:23 -0400 (0:00:00.268) 0:02:40.305 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 18:51:23 -0400 (0:00:00.712) 0:02:41.018 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 18:51:24 -0400 (0:00:00.677) 0:02:41.695 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] 
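
The crypttab tasks above reduce to validating one three-field /etc/crypttab line, "luks-<UUID> /dev/sda -" in this run: mapping name, backing device, key file, where "-" means a passphrase prompt rather than a key file. A self-contained sketch of an equivalent validation (the entry/crypttab names are local to this sketch):

    - name: Read /etc/crypttab
      ansible.builtin.slurp:
        src: /etc/crypttab
      register: crypttab

    - name: Validate the format of the crypttab entry
      vars:
        entry: "{{ (crypttab.content | b64decode).splitlines() | select('match', 'luks-') | first }}"
      ansible.builtin.assert:
        that:
          - entry.split() | length == 3     # name, backing device, key file
          - entry.split()[1] == '/dev/sda'  # backing device of the mapping
          - entry.split()[2] == '-'         # no key file; passphrase prompt
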
********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 18:51:25 -0400 (0:00:00.744) 0:02:42.440 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 18:51:26 -0400 (0:00:00.614) 0:02:43.055 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 18:51:26 -0400 (0:00:00.676) 0:02:43.731 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 18:51:26 -0400 (0:00:00.254) 0:02:43.985 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 18:51:27 -0400 (0:00:00.252) 0:02:44.238 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 18:51:27 -0400 (0:00:00.230) 0:02:44.468 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 18:51:27 -0400 (0:00:00.297) 0:02:44.765 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 18:51:27 -0400 (0:00:00.249) 0:02:45.014 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 18:51:28 -0400 (0:00:00.268) 
0:02:45.283 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 18:51:28 -0400 (0:00:00.222) 0:02:45.506 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 18:51:28 -0400 (0:00:00.222) 0:02:45.728 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 18:51:28 -0400 (0:00:00.244) 0:02:45.973 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 18:51:29 -0400 (0:00:00.292) 0:02:46.286 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 18:51:29 -0400 (0:00:00.236) 0:02:46.523 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 18:51:30 -0400 (0:00:00.557) 0:02:47.081 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 18:51:30 -0400 (0:00:00.587) 0:02:47.669 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 18:51:31 -0400 (0:00:00.641) 0:02:48.310 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT 
DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 18:51:31 -0400 (0:00:00.257) 0:02:48.567 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 18:51:32 -0400 (0:00:00.587) 0:02:49.155 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 18:51:32 -0400 (0:00:00.590) 0:02:49.745 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 18:51:33 -0400 (0:00:00.641) 0:02:50.386 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 18:51:33 -0400 (0:00:00.524) 0:02:50.910 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 18:51:34 -0400 (0:00:00.477) 0:02:51.388 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 18:51:34 -0400 (0:00:00.308) 0:02:51.696 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 18:51:35 -0400 (0:00:00.398) 0:02:52.095 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 
18:51:35 -0400 (0:00:00.297) 0:02:52.392 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 18:51:35 -0400 (0:00:00.234) 0:02:52.627 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 18:51:35 -0400 (0:00:00.241) 0:02:52.869 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 18:51:36 -0400 (0:00:00.264) 0:02:53.133 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 18:51:36 -0400 (0:00:00.329) 0:02:53.470 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 18:51:36 -0400 (0:00:00.325) 0:02:53.796 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 18:51:37 -0400 (0:00:00.276) 0:02:54.072 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 18:51:37 -0400 (0:00:00.255) 0:02:54.327 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 18:51:37 -0400 (0:00:00.266) 0:02:54.594 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] 
***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 18:51:37 -0400 (0:00:00.203) 0:02:54.797 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 18:51:38 -0400 (0:00:00.311) 0:02:55.109 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 18:51:38 -0400 (0:00:00.374) 0:02:55.484 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 18:51:38 -0400 (0:00:00.307) 0:02:55.791 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 18:51:39 -0400 (0:00:00.303) 0:02:56.095 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 18:51:39 -0400 (0:00:00.254) 0:02:56.349 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 18:51:39 -0400 (0:00:00.451) 0:02:56.801 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 18:51:40 -0400 (0:00:00.286) 0:02:57.088 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] 
****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 18:51:40 -0400 (0:00:00.283) 0:02:57.371 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 18:51:40 -0400 (0:00:00.222) 0:02:57.594 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 18:51:40 -0400 (0:00:00.269) 0:02:57.864 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 18:51:41 -0400 (0:00:00.236) 0:02:58.100 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 18:51:41 -0400 (0:00:00.253) 0:02:58.353 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 18:51:41 -0400 (0:00:00.307) 0:02:58.660 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 18:51:41 -0400 (0:00:00.348) 0:02:59.008 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 18:51:42 -0400 (0:00:00.255) 0:02:59.264 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", 
"size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:120 Saturday 04 October 2025 18:51:44 -0400 (0:00:02.573) 0:03:01.838 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 18:51:45 -0400 (0:00:00.696) 0:03:02.534 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 18:51:46 -0400 (0:00:00.695) 0:03:03.229 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:51:46 -0400 (0:00:00.530) 0:03:03.760 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:51:47 -0400 (0:00:00.417) 0:03:04.177 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:51:48 -0400 (0:00:01.086) 0:03:05.264 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:51:49 -0400 (0:00:01.208) 0:03:06.472 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:51:49 -0400 (0:00:00.371) 0:03:06.844 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:51:50 -0400 (0:00:00.374) 0:03:07.218 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:51:50 -0400 (0:00:00.294) 0:03:07.513 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:51:50 -0400 (0:00:00.280) 0:03:07.794 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:51:51 -0400 (0:00:00.822) 0:03:08.617 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:51:53 -0400 (0:00:02.419) 0:03:11.036 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 
18:51:54 -0400 (0:00:00.674) 0:03:11.711 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:51:55 -0400 (0:00:00.583) 0:03:12.294 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:51:57 -0400 (0:00:02.403) 0:03:14.698 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:51:58 -0400 (0:00:00.603) 0:03:15.301 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:51:58 -0400 (0:00:00.451) 0:03:15.753 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:51:59 -0400 (0:00:00.541) 0:03:16.295 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:51:59 -0400 (0:00:00.551) 0:03:16.846 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:52:02 -0400 (0:00:02.420) 0:03:19.266 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Saturday 04 October 2025 18:52:06 -0400 (0:00:03.908) 0:03:23.175 ******
ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Saturday 04 October 2025 18:52:06 -0400 (0:00:00.799) 0:03:23.974 ******
skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Saturday 04 October 2025 18:52:07 -0400 (0:00:00.198) 0:03:24.173 ******
fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove existing formatting on device 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079' in safe mode due to encryption removal

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Saturday 04 October 2025 18:52:09 -0400 (0:00:02.284) 0:03:26.457 ******
fatal: [managed-node8]: FAILED! => { "changed": false }
MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Saturday 04 October 2025 18:52:09 -0400 (0:00:00.330) 0:03:26.788 ******
skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Saturday 04 October 2025 18:52:09 -0400 (0:00:00.206) 0:03:26.994 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Saturday 04 October 2025 18:52:10 -0400 (0:00:00.292) 0:03:27.287 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Saturday 04 October 2025 18:52:10 -0400 (0:00:00.467) 0:03:27.756 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }
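The module_args in the failure above show 'safe_mode': True: in safe mode the blivet module refuses any action that would destroy existing formatting, and switching this volume from LUKS2-encrypted to plain (encryption: false) requires exactly such an action, so the failure here is intentional. A minimal sketch of how a play can opt in to the destructive change follows; it is an illustration, not the test's actual source. The role documents a storage_safe_mode variable (default true), and the volume values mirror the Show storage_volumes output later in this log.

    - hosts: managed-node8
      tasks:
        - name: Remove the encryption layer from the volume
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: false    # permit destruction of existing formatting
            storage_volumes:
              - name: foo
                type: disk
                disks: ['sda']
                mount_point: /opt/test1
                encryption: false       # drop the LUKS2 layer
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo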
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Saturday 04 October 2025 18:52:11 -0400 (0:00:00.364) 0:03:28.121 ******
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618304.6092403, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618304.6092403, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759618304.6092403, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3189675058", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Saturday 04 October 2025 18:52:12 -0400 (0:00:01.164) 0:03:29.285 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed
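The Stat/Assert pair above is the data-preservation check imported from verify-data-preservation.yml: the file /opt/test1/quux, created before the failed run, must still exist, proving the safe-mode failure left the filesystem untouched. A standalone sketch of the same pattern (the register variable name is an assumption; the imported test file itself is not reproduced in this log):

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __quux_stat    # hypothetical register name

    - name: Assert file presence
      ansible.builtin.assert:
        that:
          - __quux_stat.stat.exists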
TASK [Remove the encryption layer] *********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:141
Saturday 04 October 2025 18:52:12 -0400 (0:00:00.275) 0:03:29.560 ******
included: fedora.linux_system_roles.storage for managed-node8

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 04 October 2025 18:52:13 -0400 (0:00:00.858) 0:03:30.419 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 04 October 2025 18:52:13 -0400 (0:00:00.394) 0:03:30.814 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 04 October 2025 18:52:14 -0400 (0:00:00.603) 0:03:31.417 ******
skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 04 October 2025 18:52:15 -0400 (0:00:00.644) 0:03:32.061 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 04 October 2025 18:52:15 -0400 (0:00:00.280) 0:03:32.342 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 04 October 2025 18:52:15 -0400 (0:00:00.318) 0:03:32.661 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 04 October 2025 18:52:15 -0400 (0:00:00.217) 0:03:32.879 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 04 October 2025 18:52:16 -0400 (0:00:00.220) 0:03:33.099 ******
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 04 October 2025 18:52:16 -0400 (0:00:00.784) 0:03:33.884 ******
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 04 October 2025 18:52:19 -0400 (0:00:02.226) 0:03:36.110 ******
ok: [managed-node8] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 04 October 2025 18:52:19 -0400 (0:00:00.555) 0:03:36.666 ******
ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 04 October 2025 18:52:20 -0400 (0:00:00.591) 0:03:37.257 ******
ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr
repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:52:22 -0400 (0:00:02.396) 0:03:39.654 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:52:23 -0400 (0:00:00.605) 0:03:40.259 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:52:23 -0400 (0:00:00.602) 0:03:40.861 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:52:24 -0400 (0:00:00.634) 0:03:41.496 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:52:24 -0400 (0:00:00.538) 0:03:42.034 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:52:27 -0400 (0:00:02.294) 0:03:44.329 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": 
{ "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": 
"systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:52:30 -0400 (0:00:02.988) 0:03:47.317 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:52:31 -0400 (0:00:00.850) 0:03:48.168 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Saturday 04 October 2025 18:52:31 -0400 (0:00:00.223) 0:03:48.392 ******
changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
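This time the run reports changed rather than fatal, and the actions list shows blivet unwinding the stack top-down: destroy the XFS format on the dm-crypt mapping, destroy the mapping itself, destroy the LUKS format on /dev/sda, then create a fresh XFS filesystem directly on the disk. Roughly the same sequence as explicit tasks, for orientation only (the role drives blivet/libblockdev internally, not these commands):

    - name: Unmount the filesystem on the mapping
      ansible.posix.mount:
        path: /opt/test1
        state: unmounted

    - name: Destroy the dm-crypt device          # blivet's "destroy device"
      ansible.builtin.command:
        cmd: cryptsetup close luks-7f04d860-5fdb-4936-bbb5-ef0a46440079

    - name: Destroy the LUKS format on the disk  # blivet's "destroy format"
      ansible.builtin.command:
        cmd: wipefs --all /dev/sda

    - name: Create the new XFS format            # blivet's "create format"
      ansible.builtin.command:
        cmd: mkfs.xfs -f /dev/sda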
false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618241.5546894, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 18:52:35 -0400 (0:00:01.245) 0:03:52.852 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:52:37 -0400 (0:00:02.056) 0:03:54.908 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 18:52:38 -0400 (0:00:00.176) 0:03:55.084 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, 
"thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 18:52:38 -0400 (0:00:00.402) 0:03:55.486 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 18:52:38 -0400 (0:00:00.351) 0:03:55.838 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 18:52:39 -0400 (0:00:00.392) 0:03:56.231 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7f04d860-5fdb-4936-bbb5-ef0a46440079" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 18:52:41 -0400 (0:00:01.836) 0:03:58.068 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 18:52:43 -0400 (0:00:02.173) 0:04:00.241 
****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': 'UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 18:52:44 -0400 (0:00:01.599) 0:04:01.841 ****** skipping: [managed-node8] => (item={'src': 'UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 18:52:45 -0400 (0:00:00.691) 0:04:02.532 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 18:52:47 -0400 (0:00:01.964) 0:04:04.497 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618257.4948287, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2c8fdb2ae20f013afaad3687353258dea29f2b31", "ctime": 1759618246.659734, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 314573002, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759618246.660931, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "125980861", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 18:52:48 -0400 (0:00:01.283) 0:04:05.780 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 18:52:50 -0400 (0:00:01.765) 0:04:07.546 ****** ok: [managed-node8] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Saturday 04 October 2025 18:52:52 -0400 (0:00:02.123) 0:04:09.669 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 18:52:53 -0400 (0:00:00.645) 0:04:10.315 ****** skipping: [managed-node8] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 18:52:53 -0400 (0:00:00.662) 0:04:10.977 ****** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
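
Taken together, the tasks above complete the re-format: blivet destroyed the LUKS mapping on /dev/sda, /etc/fstab was switched from the /dev/mapper path to the new filesystem UUID, and the stale /etc/crypttab entry was dropped ("1 line(s) removed"). A minimal sketch of a play requesting this end state — the volume parameters are taken from the blivet_output above, while storage_safe_mode: false is an assumption needed for any destructive change:

- hosts: managed-node8
  tasks:
    - name: Re-create the volume without encryption
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false    # assumption: safe mode must be off for destructive actions
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            encryption: false       # was true; flipping it drives the destroy/create actions above

The crypttab line removed in the process would have had the standard name/device/keyfile form, here "luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /dev/sda -" (a "-" key file means the passphrase is prompted for or supplied externally).
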
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 18:52:54 -0400 (0:00:00.767) 0:04:11.745 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7b90dc69-5a01-4f7e-bb7c-684fc122d8d4" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 18:52:55 -0400 (0:00:01.235) 0:04:12.981 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002533", "end": "2025-10-04 18:52:56.785931", "rc": 0, "start": "2025-10-04 18:52:56.783398" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 18:52:57 -0400 (0:00:01.107) 0:04:14.089 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002481", "end": "2025-10-04 18:52:57.964821", "failed_when_result": false, "rc": 0, "start": "2025-10-04 18:52:57.962340" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 18:52:58 -0400 (0:00:01.146) 0:04:15.235 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 18:52:58 -0400 (0:00:00.550) 0:04:15.786 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 18:52:59 -0400 (0:00:00.850) 0:04:16.636 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 18:53:00 -0400 (0:00:00.672) 0:04:17.308 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 18:53:02 -0400 (0:00:01.770) 0:04:19.078 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 18:53:02 -0400 (0:00:00.356) 0:04:19.435 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 18:53:03 -0400 (0:00:00.621) 0:04:20.057 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 18:53:03 -0400 (0:00:00.809) 0:04:20.866 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 18:53:04 -0400 (0:00:00.398) 0:04:21.265 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 18:53:04 -0400 (0:00:00.637) 0:04:21.903 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 18:53:05 -0400 (0:00:00.647) 0:04:22.551 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 18:53:06 -0400 (0:00:00.686) 0:04:23.238 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 18:53:06 -0400 (0:00:00.283) 0:04:23.521 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 18:53:06 -0400 (0:00:00.338) 0:04:23.860 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 18:53:07 -0400 (0:00:00.325) 0:04:24.185 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 18:53:07 -0400 (0:00:00.264) 0:04:24.449 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 18:53:08 -0400 (0:00:01.042) 0:04:25.491 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 18:53:09 -0400 (0:00:00.567) 0:04:26.059 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 18:53:09 -0400 (0:00:00.698) 0:04:26.758 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 18:53:10 -0400 (0:00:00.623) 0:04:27.382 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 18:53:11 -0400 (0:00:00.712) 0:04:28.094 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 18:53:11 -0400 (0:00:00.260) 0:04:28.355 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 18:53:11 -0400 (0:00:00.668) 0:04:29.023 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 18:53:12 -0400 (0:00:00.658) 0:04:29.681 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618353.7846699, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618353.7846699, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 452, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759618353.7846699, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 18:53:13 -0400 (0:00:01.293) 0:04:30.975 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 18:53:14 -0400 (0:00:00.384) 0:04:31.360 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 18:53:14 -0400 (0:00:00.219) 0:04:31.579 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 18:53:14 -0400 (0:00:00.297) 0:04:31.877 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 18:53:15 -0400 (0:00:00.270) 0:04:32.147 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 18:53:15 -0400 (0:00:00.241) 0:04:32.389 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 18:53:15 -0400 
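
The device subset above boils down to a stat of /dev/sda plus assertions on the result (note "isblk": true in the stat output). A standalone equivalent of that presence check — a minimal sketch, not the tests' exact assertion list — would be:

- name: See whether the device node is present
  ansible.builtin.stat:
    path: /dev/sda
  register: storage_test_dev

- name: Verify the presence/absence of the device node
  ansible.builtin.assert:
    that:
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk
    msg: /dev/sda should exist and be a block device
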
(0:00:00.315) 0:04:32.704 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 18:53:15 -0400 (0:00:00.308) 0:04:33.013 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 18:53:18 -0400 (0:00:02.081) 0:04:35.094 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 18:53:18 -0400 (0:00:00.204) 0:04:35.299 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 18:53:18 -0400 (0:00:00.224) 0:04:35.523 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 18:53:19 -0400 (0:00:00.594) 0:04:36.118 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 18:53:19 -0400 (0:00:00.260) 0:04:36.379 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 18:53:19 -0400 (0:00:00.286) 0:04:36.665 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 18:53:20 -0400 (0:00:00.927) 0:04:37.593 ****** skipping: [managed-node8] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 18:53:20 -0400 (0:00:00.250) 0:04:37.844 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 18:53:21 -0400 (0:00:00.247) 0:04:38.091 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 18:53:21 -0400 (0:00:00.618) 0:04:38.710 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 18:53:22 -0400 (0:00:00.584) 0:04:39.295 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 18:53:22 -0400 (0:00:00.588) 0:04:39.884 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 18:53:23 -0400 (0:00:00.769) 0:04:40.653 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 18:53:24 -0400 (0:00:00.647) 0:04:41.300 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 18:53:24 -0400 (0:00:00.252) 0:04:41.553 ****** skipping: [managed-node8] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 18:53:24 -0400 (0:00:00.208) 0:04:41.761 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 18:53:24 -0400 (0:00:00.285) 0:04:42.047 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 18:53:25 -0400 (0:00:00.212) 0:04:42.259 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 18:53:25 -0400 (0:00:00.264) 0:04:42.524 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 18:53:25 -0400 (0:00:00.193) 0:04:42.718 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 18:53:25 -0400 (0:00:00.279) 0:04:42.998 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 18:53:26 -0400 (0:00:00.188) 0:04:43.186 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 18:53:26 -0400 (0:00:00.214) 0:04:43.401 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 18:53:26 -0400 (0:00:00.226) 0:04:43.627 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 18:53:26 -0400 (0:00:00.259) 0:04:43.887 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 18:53:27 -0400 (0:00:00.560) 0:04:44.448 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 18:53:27 -0400 (0:00:00.510) 0:04:44.958 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 18:53:28 -0400 (0:00:00.542) 0:04:45.501 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 18:53:28 -0400 (0:00:00.245) 0:04:45.747 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 18:53:29 -0400 (0:00:00.691) 0:04:46.439 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 18:53:30 -0400 (0:00:00.749) 0:04:47.189 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 18:53:30 -0400 (0:00:00.563) 0:04:47.752 
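
Everything in this size subset is gated on the volume type: the expected-size arithmetic only runs for lvm volumes, which is why storage_test_expected_size is legitimately undefined for this plain disk. For an lvm volume the final comparison reduces to something like the sketch below — the human_to_bytes filter is ansible-core's, but treating storage_test_actual_size.bytes as the parsed device size is an assumption about these tests' internals:

- name: Assert expected size is actual size
  ansible.builtin.assert:
    that:
      - (storage_test_volume.size | human_to_bytes) == storage_test_actual_size.bytes
  when: storage_test_volume.type == "lvm"
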
****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 18:53:31 -0400 (0:00:00.479) 0:04:48.232 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 18:53:31 -0400 (0:00:00.525) 0:04:48.757 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 18:53:32 -0400 (0:00:00.316) 0:04:49.074 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 18:53:32 -0400 (0:00:00.337) 0:04:49.411 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 18:53:32 -0400 (0:00:00.360) 0:04:49.771 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 18:53:33 -0400 (0:00:00.297) 0:04:50.069 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 18:53:33 -0400 (0:00:00.347) 0:04:50.416 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 18:53:33 -0400 (0:00:00.388) 0:04:50.804 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was 
False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 18:53:34 -0400 (0:00:00.325) 0:04:51.130 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 18:53:34 -0400 (0:00:00.320) 0:04:51.450 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 18:53:34 -0400 (0:00:00.425) 0:04:51.876 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 18:53:35 -0400 (0:00:00.354) 0:04:52.231 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 18:53:35 -0400 (0:00:00.318) 0:04:52.549 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 18:53:35 -0400 (0:00:00.280) 0:04:52.829 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 18:53:36 -0400 (0:00:00.313) 0:04:53.142 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 18:53:36 -0400 (0:00:00.312) 0:04:53.455 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 18:53:36 -0400 (0:00:00.461) 
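
None of the thin-pool branches above applies to this volume, but the pattern behind the whole verification pass is visible in the task names: test-verify-volume.yml dispatches one include per subset (mount, fstab, fs, device, encryption, md, size, cache — the list set earlier as _storage_volume_tests). A sketch of that dispatch loop, reconstructed from the task names and loop variable shown in this log:

- name: Run test verify for storage_test_volume_subset
  ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset
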
0:04:53.916 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 18:53:37 -0400 (0:00:00.239) 0:04:54.155 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 18:53:37 -0400 (0:00:00.234) 0:04:54.390 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 18:53:37 -0400 (0:00:00.497) 0:04:54.888 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 18:53:38 -0400 (0:00:00.247) 0:04:55.136 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 18:53:38 -0400 (0:00:00.238) 0:04:55.375 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 18:53:38 -0400 (0:00:00.306) 0:04:55.684 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 18:53:38 -0400 (0:00:00.281) 0:04:55.966 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 18:53:39 -0400 (0:00:00.232) 0:04:56.199 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 18:53:39 -0400 (0:00:00.262) 0:04:56.462 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 18:53:39 -0400 (0:00:00.246) 0:04:56.708 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 18:53:39 -0400 (0:00:00.194) 0:04:56.903 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 18:53:40 -0400 (0:00:00.333) 0:04:57.236 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:161 Saturday 04 October 2025 18:53:41 -0400 (0:00:01.218) 0:04:58.455 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 18:53:42 -0400 (0:00:00.674) 0:04:59.129 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 18:53:42 -0400 (0:00:00.564) 0:04:59.694 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:53:43 -0400 (0:00:00.505) 
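
From here the test pivots: it plants a marker file on the mounted filesystem, then re-runs the role with safe mode on and encryption turned back on, expecting the role to refuse rather than destroy /opt/test1/quux. The volume parameters below appear later in this log under "Show storage_volumes"; the block/rescue wrapper is an illustrative assumption standing in for the tests' verify-role-failed.yml:

- name: Re-encrypting must fail while safe mode is on
  block:
    - name: Run the storage role (expected to fail)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo
    - name: Fail the test if the role did not raise an error
      ansible.builtin.fail:
        msg: Role was expected to fail in safe mode but succeeded
  rescue:
    - name: Treat the rescued failure as the expected outcome
      ansible.builtin.debug:
        msg: Role failed as expected under safe mode
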
0:05:00.199 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:53:43 -0400 (0:00:00.386) 0:05:00.585 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:53:44 -0400 (0:00:00.484) 0:05:01.070 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:53:44 -0400 (0:00:00.762) 0:05:01.832 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:53:45 -0400 (0:00:00.918) 0:05:02.751 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:53:46 -0400 (0:00:00.314) 0:05:03.065 
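
The CentOS_9.yml vars file included twice above supplies the blivet package set; the last entry uses inline Jinja to pick the s390x-specific libblockdev build. Reconstructed verbatim from the output, the relevant portion of that vars file is:

blivet_package_list:
  - python3-blivet
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - vdo
  - kmod-kvdo
  - xfsprogs
  - stratisd
  - stratis-cli
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
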
****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:53:46 -0400 (0:00:00.324) 0:05:03.391 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:53:46 -0400 (0:00:00.283) 0:05:03.675 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:53:47 -0400 (0:00:00.873) 0:05:04.548 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:53:49 -0400 (0:00:02.445) 0:05:06.993 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:53:50 -0400 (0:00:00.679) 0:05:07.673 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:53:51 -0400 (0:00:00.579) 0:05:08.253 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:53:53 -0400 (0:00:02.377) 0:05:10.630 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:53:53 -0400 (0:00:00.421) 0:05:11.051 ****** skipping: [managed-node8] => { "changed": 
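
Note: the storage_volumes value printed above is the test's input for this step. As a minimal sketch, the equivalent role invocation would look like the following (values copied from the output above; any volume fields not listed fall back to the role's volume_defaults):

    - name: Add encryption to the disk volume (sketch)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo
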
false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:53:54 -0400 (0:00:00.416) 0:05:11.468 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:53:54 -0400 (0:00:00.530) 0:05:11.999 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:53:55 -0400 (0:00:00.597) 0:05:12.597 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:53:57 -0400 (0:00:02.022) 0:05:14.619 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": 
"systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service": { "name": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": 
"systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:54:00 -0400 (0:00:03.008) 0:05:17.628 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:54:01 -0400 (0:00:00.835) 0:05:18.463 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d7f04d860\x2d5fdb\x2d4936\x2dbbb5\x2def0a46440079.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "name": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket 
\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target dev-sda.device systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d7f04d860\\\\x2d5fdb\\\\x2d4936\\\\x2dbbb5\\\\x2def0a46440079.target\" cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7f04d860-5fdb-4936-bbb5-ef0a46440079", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f04d860-5fdb-4936-bbb5-ef0a46440079 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", 
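
Note: the masking performed by this task keeps the systemd-cryptsetup@*.service units generated from /etc/crypttab from being activated while blivet reworks the underlying devices; a later task unmasks them again. A rough sketch of the same pattern outside the role (the filter expression here is illustrative, not the role's exact one):

    - name: Get service facts
      ansible.builtin.service_facts:

    - name: Select generated cryptsetup units (illustrative filter)
      ansible.builtin.set_fact:
        storage_cryptsetup_services: "{{ ansible_facts.services.keys() | select('search', '^systemd-cryptsetup@luks') | list }}"

    - name: Mask the systemd cryptsetup services
      ansible.builtin.systemd:
        name: "{{ item }}"
        masked: true
      loop: "{{ storage_cryptsetup_services }}"
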
"IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f04d860\\\\x2d5fdb\\\\x2d4936\\\\x2dbbb5\\\\x2def0a46440079.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", 
"RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 18:52:47 EDT", "StateChangeTimestampMonotonic": "1916551085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7f04d860\\\\x2d5fdb\\\\x2d4936\\\\x2dbbb5\\\\x2def0a46440079.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:54:03 -0400 (0:00:01.612) 0:05:20.076 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 18:54:05 -0400 (0:00:02.281) 0:05:22.357 ****** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:54:05 -0400 (0:00:00.488) 0:05:22.846 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d7f04d860\x2d5fdb\x2d4936\x2dbbb5\x2def0a46440079.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "name": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
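
Note: this failure is the step's expected outcome: the module args above show 'safe_mode': True, and wrapping sda in LUKS would remove the existing xfs formatting, which safe mode refuses to do. To actually apply such a destructive change, the role has to be invoked with safe mode disabled, roughly as follows (a sketch; storage_safe_mode defaults to true):

    - name: Add encryption, allowing existing formatting to be removed (sketch)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo
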
"CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", 
"LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7f04d860\\x2d5fdb\\x2d4936\\x2dbbb5\\x2def0a46440079.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f04d860\\\\x2d5fdb\\\\x2d4936\\\\x2dbbb5\\\\x2def0a46440079.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 18:54:07 -0400 (0:00:01.486) 0:05:24.334 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 18:54:07 -0400 (0:00:00.375) 0:05:24.710 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 18:54:08 -0400 (0:00:00.388) 0:05:25.098 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 04 October 2025 18:54:08 -0400 (0:00:00.361) 0:05:25.459 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618421.160258, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618421.160258, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759618421.160258, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1312279138", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 04 October 2025 18:54:09 -0400 (0:00:01.174) 0:05:26.634 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:182 Saturday 04 October 2025 18:54:09 -0400 (0:00:00.401) 0:05:27.036 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:54:11 -0400 (0:00:01.357) 0:05:28.393 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:54:11 -0400 (0:00:00.516) 0:05:28.909 ****** skipping: [managed-node8] => { "changed": false, 
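
Note: the stat/assert pair above is the test's data-preservation check: /opt/test1/quux, created before the failed role run, must still exist afterwards (and does, per the stat result). The pattern, as a minimal sketch (the register name __file_stat is illustrative):

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __file_stat

    - name: Assert file presence
      ansible.builtin.assert:
        that:
          - __file_stat.stat.exists
        msg: Data was lost during the failed storage run
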
"false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:54:12 -0400 (0:00:00.599) 0:05:29.508 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:54:13 -0400 (0:00:00.714) 0:05:30.223 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:54:13 -0400 (0:00:00.403) 0:05:30.626 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:54:13 -0400 (0:00:00.287) 0:05:30.913 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:54:14 -0400 (0:00:00.277) 0:05:31.190 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:54:14 -0400 (0:00:00.299) 0:05:31.491 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:54:15 -0400 (0:00:00.931) 0:05:32.422 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:54:17 -0400 (0:00:02.092) 0:05:34.515 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:54:18 -0400 (0:00:00.620) 0:05:35.135 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:54:18 -0400 (0:00:00.729) 0:05:35.864 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:54:20 -0400 (0:00:02.013) 0:05:37.878 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:54:21 -0400 (0:00:00.680) 0:05:38.559 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:54:22 -0400 (0:00:00.653) 0:05:39.212 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:54:22 -0400 (0:00:00.497) 0:05:39.709 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:54:23 -0400 (0:00:00.571) 0:05:40.281 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:54:25 -0400 (0:00:02.316) 0:05:42.598 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { 
"name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { 
"name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": 
"systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:54:28 -0400 (0:00:02.799) 0:05:45.397 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:54:29 -0400 (0:00:00.909) 0:05:46.307 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:54:29 -0400 (0:00:00.181) 0:05:46.489 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 
0, "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 18:54:42 -0400 (0:00:12.831) 0:05:59.321 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 18:54:42 -0400 (0:00:00.600) 0:05:59.921 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618364.574764, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0049b8e5b470b280c6fded4d4873fafd34877f5e", "ctime": 1759618364.571764, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618364.571764, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 18:54:44 -0400 (0:00:01.313) 0:06:01.234 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:54:46 -0400 (0:00:02.026) 0:06:03.261 ****** skipping: 
[managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 18:54:46 -0400 (0:00:00.185) 0:06:03.446 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 18:54:46 -0400 (0:00:00.359) 0:06:03.805 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 18:54:47 -0400 (0:00:00.345) 0:06:04.150 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 18:54:47 -0400 (0:00:00.344) 0:06:04.495 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': 'UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=7b90dc69-5a01-4f7e-bb7c-684fc122d8d4" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 18:54:49 -0400 (0:00:01.700) 0:06:06.195 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 18:54:51 -0400 (0:00:01.972) 0:06:08.168 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f" } TASK 
[fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 18:54:52 -0400 (0:00:01.730) 0:06:09.899 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 18:54:53 -0400 (0:00:00.637) 0:06:10.536 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 18:54:55 -0400 (0:00:01.967) 0:06:12.504 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618377.963881, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618370.2618136, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 595591399, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1759618370.2626297, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2515481354", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 18:54:56 -0400 (0:00:01.186) 0:06:13.691 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-888b131a-655c-4527-921b-02a5849fd56f', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 18:54:58 -0400 (0:00:01.771) 0:06:15.462 ****** ok: [managed-node8] TASK [Verify role results - 3] 
************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:196 Saturday 04 October 2025 18:55:00 -0400 (0:00:02.191) 0:06:17.654 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 18:55:01 -0400 (0:00:01.036) 0:06:18.691 ****** skipping: [managed-node8] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 18:55:02 -0400 (0:00:00.607) 0:06:19.298 ****** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 18:55:02 -0400 (0:00:00.553) 0:06:19.851 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "size": "10G", "type": "crypt", "uuid": "977e1552-8d25-4a26-a2e7-99c5c353fec0" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "888b131a-655c-4527-921b-02a5849fd56f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 18:55:04 -0400 (0:00:01.235) 0:06:21.086 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003692", "end": "2025-10-04 18:55:05.975265", "rc": 0, "start": "2025-10-04 18:55:04.971573" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 18:55:06 -0400 (0:00:02.158) 0:06:23.245 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003384", "end": "2025-10-04 18:55:07.125087", "failed_when_result": false, "rc": 0, "start": "2025-10-04 18:55:07.121703" } STDOUT: luks-888b131a-655c-4527-921b-02a5849fd56f /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 18:55:07 -0400 (0:00:01.148) 0:06:24.394 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 18:55:08 -0400 (0:00:00.671) 0:06:25.066 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) 
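(For orientation: the volume verified above was created by the storage role with LUKS2 encryption enabled. The following is a minimal sketch of such an invocation, reconstructed from the volume facts printed earlier in this run rather than copied from tests_luks2.yml; the password variable is a hypothetical stand-in for the value logged as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER.)

    - name: Create an encrypted (LUKS2) disk volume, sketch based on the logged volume facts
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "{{ luks_password }}"  # hypothetical variable; masked by no_log in the output above

The crypttab line read back above, "luks-888b131a-655c-4527-921b-02a5849fd56f /dev/sda -", follows the crypttab(5) field order (mapped name, backing device, key file, options); the trailing "-" means no key file is configured, so the passphrase must be supplied when the volume is opened. The cryptsetup luksDump output later in this run shows a version 2 header with an argon2id PBKDF and a 512-bit aes-xts-plain64 key, consistent with encryption_luks_version: luks2.
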
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 18:55:08 -0400 (0:00:00.850) 0:06:25.916 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 18:55:09 -0400 (0:00:00.654) 0:06:26.571 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 18:55:11 -0400 (0:00:01.860) 0:06:28.431 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 18:55:11 -0400 (0:00:00.478) 0:06:28.909 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 18:55:12 -0400 (0:00:00.760) 0:06:29.669 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current 
mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 18:55:13 -0400 (0:00:00.802) 0:06:30.472 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 18:55:13 -0400 (0:00:00.406) 0:06:30.878 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 18:55:14 -0400 (0:00:00.731) 0:06:31.610 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 18:55:15 -0400 (0:00:00.705) 0:06:32.315 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 18:55:15 -0400 (0:00:00.694) 0:06:33.009 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 18:55:16 -0400 (0:00:00.267) 0:06:33.277 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 18:55:16 -0400 (0:00:00.241) 0:06:33.518 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 18:55:16 -0400 (0:00:00.233) 0:06:33.752 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 18:55:16 -0400 (0:00:00.290) 0:06:34.043 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 18:55:18 -0400 (0:00:01.088) 0:06:35.131 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 18:55:18 -0400 (0:00:00.751) 0:06:35.883 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 18:55:19 -0400 (0:00:00.667) 0:06:36.551 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 18:55:20 -0400 (0:00:00.568) 0:06:37.119 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 18:55:20 -0400 (0:00:00.734) 0:06:37.854 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 18:55:21 -0400 (0:00:00.323) 0:06:38.177 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 18:55:21 -0400 (0:00:00.729) 0:06:38.907 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is 
present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 18:55:22 -0400 (0:00:00.691) 0:06:39.628 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618481.6957865, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618481.6957865, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 452, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759618481.6957865, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 18:55:23 -0400 (0:00:01.181) 0:06:40.809 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 18:55:24 -0400 (0:00:00.401) 0:06:41.210 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 18:55:24 -0400 (0:00:00.290) 0:06:41.501 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 18:55:24 -0400 (0:00:00.355) 0:06:41.857 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 18:55:25 -0400 (0:00:00.282) 0:06:42.139 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 18:55:25 -0400 (0:00:00.312) 0:06:42.474 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 18:55:25 -0400 (0:00:00.349) 0:06:42.823 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618481.9197884, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618481.9197884, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1118, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759618481.9197884, "nlink": 1, "path": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 18:55:27 -0400 (0:00:01.325) 0:06:44.149 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 18:55:29 -0400 (0:00:02.392) 0:06:46.542 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.006630", "end": "2025-10-04 18:55:30.463719", "rc": 0, "start": "2025-10-04 18:55:30.457089" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 888b131a-655c-4527-921b-02a5849fd56f Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 689031 Threads: 2 Salt: 97 b7 81 a6 fc e5 d6 91 55 fa 29 44 44 be 28 f7 b1 0a 8b d4 d6 c2 f6 16 0c 58 88 ad 62 68 b1 c1 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131995 Salt: 24 61 1a 57 17 07 41 7f b1 08 45 f9 6f 6f 92 8b 1f 79 2c 1e 7b ea d7 74 80 6f 61 fe 05 90 dd b6 Digest: 73 c6 0b dc 22 cd 4e 9b e6 a9 4c 42 2d df b6 f0 1c 10 7f 03 7a 76 b2 95 53 21 67 a9 38 5f 05 41 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 18:55:30 -0400 (0:00:01.281) 0:06:47.824 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 18:55:31 -0400 (0:00:00.690) 0:06:48.514 ****** ok: [managed-node8] => { "changed": false } MSG: All 
assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 18:55:32 -0400 (0:00:00.741) 0:06:49.256 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 18:55:32 -0400 (0:00:00.241) 0:06:49.498 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 18:55:33 -0400 (0:00:00.963) 0:06:50.461 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 18:55:34 -0400 (0:00:00.792) 0:06:51.253 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 18:55:34 -0400 (0:00:00.389) 0:06:51.643 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 18:55:34 -0400 (0:00:00.394) 0:06:52.037 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-888b131a-655c-4527-921b-02a5849fd56f /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 18:55:35 -0400 (0:00:00.733) 0:06:52.771 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 18:55:36 -0400 (0:00:00.676) 0:06:53.447 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 18:55:37 -0400 (0:00:00.780) 0:06:54.228 ****** ok: [managed-node8] => { "changed": 
false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 18:55:37 -0400 (0:00:00.705) 0:06:54.933 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 18:55:38 -0400 (0:00:00.742) 0:06:55.675 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 18:55:38 -0400 (0:00:00.314) 0:06:55.990 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 18:55:39 -0400 (0:00:00.194) 0:06:56.185 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 18:55:39 -0400 (0:00:00.228) 0:06:56.414 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 18:55:39 -0400 (0:00:00.220) 0:06:56.634 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 18:55:39 -0400 (0:00:00.312) 0:06:56.946 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 18:55:40 -0400 (0:00:00.284) 0:06:57.231 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 18:55:40 -0400 (0:00:00.240) 0:06:57.471 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 18:55:40 -0400 (0:00:00.254) 0:06:57.726 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 18:55:40 -0400 (0:00:00.221) 0:06:57.948 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 18:55:41 -0400 (0:00:00.282) 0:06:58.231 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 18:55:41 -0400 (0:00:00.255) 0:06:58.487 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 18:55:42 -0400 (0:00:00.613) 0:06:59.100 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 18:55:42 -0400 (0:00:00.492) 0:06:59.592 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 18:55:43 -0400 (0:00:00.533) 0:07:00.126 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 
04 October 2025 18:55:43 -0400 (0:00:00.339) 0:07:00.465 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 18:55:44 -0400 (0:00:00.679) 0:07:01.144 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 18:55:44 -0400 (0:00:00.611) 0:07:01.755 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 18:55:45 -0400 (0:00:00.537) 0:07:02.292 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 18:55:45 -0400 (0:00:00.532) 0:07:02.825 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 18:55:46 -0400 (0:00:00.579) 0:07:03.404 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 18:55:46 -0400 (0:00:00.369) 0:07:03.774 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 18:55:47 -0400 (0:00:00.395) 0:07:04.169 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 18:55:47 -0400 (0:00:00.333) 0:07:04.503 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 18:55:47 -0400 (0:00:00.316) 0:07:04.819 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 18:55:48 -0400 (0:00:00.310) 0:07:05.129 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 18:55:48 -0400 (0:00:00.383) 0:07:05.513 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 18:55:48 -0400 (0:00:00.297) 0:07:05.818 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 18:55:49 -0400 (0:00:00.348) 0:07:06.167 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 18:55:49 -0400 (0:00:00.336) 0:07:06.503 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 18:55:49 -0400 (0:00:00.370) 0:07:06.874 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 18:55:50 -0400 (0:00:00.323) 0:07:07.198 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 18:55:50 -0400 (0:00:00.356) 0:07:07.554 ****** skipping: [managed-node8] => { "changed": false, 
"false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 18:55:50 -0400 (0:00:00.346) 0:07:07.900 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 18:55:51 -0400 (0:00:00.262) 0:07:08.162 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 18:55:51 -0400 (0:00:00.291) 0:07:08.454 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 18:55:51 -0400 (0:00:00.366) 0:07:08.821 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 18:55:52 -0400 (0:00:00.300) 0:07:09.121 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 18:55:52 -0400 (0:00:00.568) 0:07:09.690 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 18:55:52 -0400 (0:00:00.239) 0:07:09.930 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 18:55:53 -0400 (0:00:00.337) 0:07:10.268 ****** skipping: [managed-node8] => { "changed": 
false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 18:55:53 -0400 (0:00:00.282) 0:07:10.551 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 18:55:53 -0400 (0:00:00.281) 0:07:10.832 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 18:55:54 -0400 (0:00:00.255) 0:07:11.088 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 18:55:54 -0400 (0:00:00.253) 0:07:11.342 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 18:55:54 -0400 (0:00:00.240) 0:07:11.583 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 18:55:54 -0400 (0:00:00.325) 0:07:11.908 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:203 Saturday 04 October 2025 18:55:55 -0400 (0:00:00.301) 0:07:12.210 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 18:55:56 -0400 (0:00:00.921) 0:07:13.131 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], 
"storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 18:55:56 -0400 (0:00:00.731) 0:07:13.863 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:55:57 -0400 (0:00:00.509) 0:07:14.372 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:55:57 -0400 (0:00:00.467) 0:07:14.839 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:55:58 -0400 (0:00:00.724) 0:07:15.564 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:55:59 -0400 (0:00:00.663) 0:07:16.228 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:56:00 -0400 (0:00:00.928) 0:07:17.156 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:56:00 -0400 (0:00:00.349) 0:07:17.505 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:56:00 -0400 (0:00:00.325) 0:07:17.831 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:56:01 -0400 (0:00:00.343) 0:07:18.174 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:56:02 -0400 (0:00:00.944) 0:07:19.119 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:56:04 -0400 (0:00:02.432) 0:07:21.551 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:56:05 -0400 (0:00:00.717) 0:07:22.269 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:56:05 -0400 (0:00:00.518) 0:07:22.788 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:56:08 -0400 (0:00:02.304) 0:07:25.092 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:56:08 -0400 (0:00:00.556) 0:07:25.648 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:56:09 -0400 (0:00:00.435) 0:07:26.084 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:56:09 -0400 (0:00:00.572) 0:07:26.657 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:56:10 -0400 (0:00:00.562) 0:07:27.219 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:56:12 -0400 (0:00:02.358) 0:07:29.581 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": 
"dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", 
"state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": 
"systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:56:16 -0400 (0:00:03.716) 0:07:33.298 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:56:17 -0400 (0:00:00.881) 0:07:34.179 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified 
state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:56:17 -0400 (0:00:00.211) 0:07:34.395 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password
TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 18:56:19 -0400 (0:00:02.357) 0:07:36.753 ****** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
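
The failure above is the intended result of this negative test: in the module_args echoed by the Failed message task, volume 'test1' requests 'encryption': True with 'encryption_luks_version': 'luks2', but both 'encryption_password' and 'encryption_key' are None, so blivet refuses to act ("encrypted volume 'test1' missing key/password") and the role reports zero actions. A minimal sketch of the storage_pools input that reproduces this, reconstructed from the module_args above (only the non-default fields are shown):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            # Neither encryption_password nor encryption_key is supplied,
            # which triggers: "encrypted volume 'test1' missing key/password"
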
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:56:20 -0400 (0:00:00.488) 0:07:37.241 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 18:56:20 -0400 (0:00:00.213) 0:07:37.454 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 18:56:20 -0400 (0:00:00.293) 0:07:37.747 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 18:56:21 -0400 (0:00:00.464) 0:07:38.212 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }
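
The test then retries the same layout with a passphrase supplied, which is all the role needs to create the LUKS2 volume. A sketch of the corrected invocation, assuming ansible.builtin.include_role as the entry point (the test playbook may wire the role in differently); the pool definition matches what the Show storage_pools task prints below:

    - name: Create an encrypted partition volume w/ default fs
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo  # throwaway test passphrase from this run
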
"false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:56:24 -0400 (0:00:00.763) 0:07:41.806 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:56:25 -0400 (0:00:00.270) 0:07:42.077 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:56:25 -0400 (0:00:00.266) 0:07:42.343 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:56:25 -0400 (0:00:00.228) 0:07:42.571 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:56:25 -0400 (0:00:00.277) 0:07:42.850 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:56:26 -0400 (0:00:00.725) 0:07:43.575 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:56:28 -0400 (0:00:02.450) 0:07:46.026 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:56:29 -0400 (0:00:00.589) 0:07:46.616 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:56:30 -0400 (0:00:00.581) 0:07:47.197 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:56:32 -0400 (0:00:02.190) 0:07:49.387 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:56:32 -0400 (0:00:00.503) 0:07:49.891 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:56:33 -0400 (0:00:00.439) 0:07:50.331 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:56:33 -0400 (0:00:00.617) 0:07:50.948 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path:
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:56:34 -0400 (0:00:00.553) 0:07:51.502 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:56:36 -0400 (0:00:02.314) 0:07:53.816 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { 
"name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": 
"stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:56:39 -0400 (0:00:02.904) 0:07:56.721 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:56:40 -0400 (0:00:00.826) 0:07:57.548 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:56:40 -0400 (0:00:00.216) 0:07:57.764 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": 
null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }
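The action sequence above is the whole LUKS2 conversion in miniature: blivet tears down the previous whole-disk LUKS volume (destroy format and device on the old mapper, then on /dev/sda), writes a new disklabel, creates /dev/sda1, formats it as LUKS, and puts xfs on the opened mapper device. For orientation, a storage_pools value consistent with the pool dictionary echoed above would look roughly like the sketch below; this is a reconstruction from the logged fields, not the literal test input (which lives in tests_luks2.yml and supplies the password under no_log, so __test_luks_password here is a placeholder):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            mount_options: defaults
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "{{ __test_luks_password }}"  # placeholder; the log only shows VALUE_SPECIFIED_IN_NO_LOG_PARAMETER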
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 18:56:53 -0400 (0:00:12.465) 0:08:10.229 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 18:56:53 -0400 (0:00:00.473) 0:08:10.703 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618492.6028817, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1f8aa67a9d19f646f414c9417033809b923b946", "ctime": 1759618492.5998816, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618492.5998816, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 18:56:54 -0400 (0:00:01.190) 0:08:11.893 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:56:56 -0400 (0:00:01.282) 0:08:13.176 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 18:56:56 -0400 (0:00:00.255) 0:08:13.432 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": 
[], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 18:56:57 -0400 (0:00:01.237) 0:08:14.670 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 18:56:58 -0400 (0:00:00.395) 0:08:15.065 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 
18:56:58 -0400 (0:00:00.302) 0:08:15.368 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-888b131a-655c-4527-921b-02a5849fd56f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 18:56:59 -0400 (0:00:01.682) 0:08:17.051 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 18:57:01 -0400 (0:00:01.913) 0:08:18.965 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5" }
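The redirect lines above are worth noting: the role invokes ansible.builtin.mount, which this controller resolves to ansible.posix.mount. A standalone task equivalent to the mount entry just written, assembled only from the fields visible in the loop item (a sketch, not the role's internal invocation):

    - name: Mount the opened LUKS volume at /opt/test1  # sketch built from the logged loop item, not the role's own task
      ansible.posix.mount:
        src: /dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted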
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 18:57:03 -0400 (0:00:01.750) 0:08:20.716 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 18:57:04 -0400 (0:00:00.797) 0:08:21.514 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 18:57:06 -0400 (0:00:01.823) 0:08:23.337 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618507.1240084, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "16fdd7eb0ddbc74adfd0a6ea77fb2b0b33123ea2", "ctime": 1759618498.14393, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 176160965, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759618498.1441777, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1198228870", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 18:57:07 -0400 (0:00:01.204) 0:08:24.541 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-888b131a-655c-4527-921b-02a5849fd56f', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-888b131a-655c-4527-921b-02a5849fd56f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-b03103b5-9920-48d2-9376-8a11631c90a5', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 18:57:10 -0400 (0:00:02.523) 0:08:27.065 ****** ok: [managed-node8] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:241 Saturday 04 October 2025 18:57:12 -0400 (0:00:02.146) 0:08:29.211 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 18:57:13 -0400 (0:00:00.976) 0:08:30.188 ****** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 18:57:13 -0400 (0:00:00.763) 0:08:30.951 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 18:57:14 -0400 (0:00:00.728) 0:08:31.680 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "size": "4G", "type": "crypt", "uuid": "2efef783-4021-476f-a847-3e4268172fdd" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "b03103b5-9920-48d2-9376-8a11631c90a5" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } }
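This device scan is the ground truth the remaining verification tasks check against: /dev/sda1 now carries crypto_LUKS with the UUID that is embedded in the mapper name, and the opened mapper device carries a 4G xfs filesystem mounted at /opt/test1. Condensed into a single task, and assuming the scan result were registered as blk_info (a hypothetical name; the real tests use their own variables), such a check could read:

    - name: Spot-check the LUKS stack reported by the device scan  # condensed, hypothetical sketch
      ansible.builtin.assert:
        that:
          - blk_info.info['/dev/sda1'].fstype == 'crypto_LUKS'
          - blk_info.info['/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5'].mountpoint == '/opt/test1'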
TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 18:57:15 -0400 (0:00:01.337) 0:08:33.017 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003657", "end": "2025-10-04 18:57:17.871818", "rc": 0, "start": "2025-10-04 18:57:16.868161" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 18:57:18 -0400 (0:00:02.225) 0:08:35.242 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002483", "end": "2025-10-04 18:57:19.144591", "failed_when_result": false, "rc": 0, "start": "2025-10-04 18:57:19.142108" } STDOUT: luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 -
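The single crypttab line above pairs with the new fstab entry read just before it: the first field names the device-mapper mapping, the second is the backing LUKS device, and the trailing "-" means no stored keyfile. The role edits /etc/crypttab itself, but an equivalent standalone edit could use the community.general.crypttab module (a sketch under that assumption, not what the role runs internally):

    - name: Ensure the crypttab entry for the new LUKS device  # sketch; the role edits the file directly rather than calling this module
      community.general.crypttab:
        name: luks-b03103b5-9920-48d2-9376-8a11631c90a5
        backing_device: /dev/sda1
        password: '-'
        state: present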
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 18:57:19 -0400 (0:00:01.234) 0:08:36.477 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 18:57:20 -0400 (0:00:00.652) 0:08:37.130 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 18:57:20 -0400 (0:00:00.338) 0:08:37.468 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 18:57:20 -0400 (0:00:00.304) 0:08:37.773 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 18:57:20 -0400 (0:00:00.252) 0:08:38.026 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 18:57:21 -0400 (0:00:00.812) 0:08:38.838 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 18:57:22 -0400 (0:00:00.337) 0:08:39.175 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 18:57:22 -0400 (0:00:00.361) 0:08:39.536 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 04 October 2025 18:57:22 -0400 (0:00:00.262) 0:08:39.799 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 18:57:22 -0400 (0:00:00.248) 0:08:40.047 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 18:57:23 -0400 (0:00:00.246) 0:08:40.294 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 18:57:23 -0400 (0:00:00.279) 0:08:40.573 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 18:57:23 -0400 (0:00:00.353) 0:08:40.927 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 18:57:24 -0400 (0:00:00.260) 0:08:41.188 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 18:57:24 -0400 (0:00:00.307) 0:08:41.495 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:65676): WARNING **: 18:57:25.220: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 18:57:25 -0400 (0:00:01.136) 0:08:42.631 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 18:57:26 -0400 (0:00:00.523) 0:08:43.155 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 18:57:26 -0400 (0:00:00.774) 0:08:43.929 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 18:57:27 -0400 (0:00:00.231) 0:08:44.161 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 18:57:27 -0400 (0:00:00.278) 0:08:44.440 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 18:57:27 -0400 (0:00:00.214) 0:08:44.656 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 18:57:27 -0400 (0:00:00.365) 0:08:45.022 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", 
"skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 18:57:28 -0400 (0:00:00.248) 0:08:45.270 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 18:57:28 -0400 (0:00:00.268) 0:08:45.539 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 18:57:28 -0400 (0:00:00.236) 0:08:45.776 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 18:57:28 -0400 (0:00:00.259) 0:08:46.036 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 18:57:29 -0400 (0:00:00.241) 0:08:46.278 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 18:57:29 -0400 (0:00:00.261) 0:08:46.539 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 18:57:29 -0400 (0:00:00.335) 0:08:46.875 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 18:57:30 -0400 (0:00:00.733) 0:08:47.608 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 
'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 18:57:30 -0400 (0:00:00.387) 0:08:47.996 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 18:57:31 -0400 (0:00:00.852) 0:08:48.848 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 18:57:32 -0400 (0:00:00.364) 0:08:49.213 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 18:57:33 -0400 (0:00:00.920) 0:08:50.134 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 18:57:34 -0400 (0:00:01.541) 0:08:51.675 ****** skipping: 
[managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 18:57:35 -0400 (0:00:01.121) 0:08:52.797 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 18:57:36 -0400 (0:00:00.290) 0:08:53.094 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 18:57:36 -0400 (0:00:00.310) 0:08:53.404 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 18:57:36 -0400 (0:00:00.636) 0:08:54.041 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 18:57:37 -0400 (0:00:00.352) 0:08:54.393 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 18:57:38 -0400 (0:00:01.046) 0:08:55.439 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 18:57:38 -0400 (0:00:00.272) 0:08:55.712 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 18:57:38 -0400 (0:00:00.301) 0:08:56.014 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 18:57:39 -0400 (0:00:00.373) 0:08:56.387 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 18:57:39 -0400 (0:00:00.304) 0:08:56.692 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 18:57:39 -0400 (0:00:00.329) 0:08:57.021 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 18:57:40 -0400 (0:00:00.296) 0:08:57.318 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 18:57:40 -0400 (0:00:00.299) 0:08:57.618 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 18:57:40 -0400 (0:00:00.343) 0:08:57.961 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 18:57:41 -0400 (0:00:00.597) 0:08:58.559 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 18:57:42 -0400 
(0:00:00.634) 0:08:59.193 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 18:57:43 -0400 (0:00:01.791) 0:09:00.985 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 18:57:44 -0400 (0:00:00.428) 0:09:01.414 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 18:57:45 -0400 (0:00:00.805) 0:09:02.219 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 18:57:45 -0400 (0:00:00.807) 0:09:03.026 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 18:57:46 -0400 (0:00:00.374) 0:09:03.401 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is 
none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 18:57:47 -0400 (0:00:00.685) 0:09:04.087 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 18:57:47 -0400 (0:00:00.781) 0:09:04.869 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 18:57:48 -0400 (0:00:00.726) 0:09:05.596 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 18:57:48 -0400 (0:00:00.278) 0:09:05.874 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 18:57:49 -0400 (0:00:00.276) 0:09:06.151 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 18:57:49 -0400 (0:00:00.315) 0:09:06.466 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 18:57:49 -0400 (0:00:00.330) 0:09:06.796 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device 
identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 18:57:50 -0400 (0:00:00.879) 0:09:07.676 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 18:57:51 -0400 (0:00:00.656) 0:09:08.333 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 18:57:51 -0400 (0:00:00.712) 0:09:09.045 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 18:57:52 -0400 (0:00:00.642) 0:09:09.688 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 18:57:53 -0400 (0:00:00.758) 0:09:10.447 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 18:57:53 -0400 (0:00:00.320) 0:09:10.767 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 18:57:54 -0400 (0:00:00.685) 0:09:11.452 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 18:57:55 -0400 (0:00:00.656) 0:09:12.110 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618612.593929, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618612.593929, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1229, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/blockdevice", "mode": "0660", "mtime": 1759618612.593929, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 18:57:56 -0400 (0:00:01.075) 0:09:13.193 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 18:57:56 -0400 (0:00:00.397) 0:09:13.591 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 18:57:56 -0400 (0:00:00.261) 0:09:13.853 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 18:57:57 -0400 (0:00:00.321) 0:09:14.175 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 18:57:57 -0400 (0:00:00.365) 0:09:14.540 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 18:57:57 -0400 (0:00:00.203) 0:09:14.744 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 18:57:58 -0400 (0:00:00.345) 0:09:15.089 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618612.8379312, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618612.8379312, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1299, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759618612.8379312, "nlink": 1, "path": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 18:57:59 -0400 (0:00:01.274) 0:09:16.368 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 18:58:01 -0400 (0:00:02.265) 0:09:18.634 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.006809", "end": "2025-10-04 18:58:02.577788", "rc": 0, "start": "2025-10-04 18:58:02.570979" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: b03103b5-9920-48d2-9376-8a11631c90a5 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 690733 Threads: 2 Salt: 0d d5 6a ce 91 69 3b 4e 82 d5 67 b9 a0 5a e8 77 20 97 61 cd 02 ef 10 24 9c a2 3a 24 af d5 eb 4f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131334 Salt: 30 7a 14 71 d1 8d 3a df e4 2a 16 94 80 b4 f2 d0 25 cb ae f8 58 68 a0 58 7f ac ba 37 29 70 c6 b1 Digest: 87 14 da 12 44 6b 9c 4f a7 55 a5 a0 07 23 b6 3a db b4 3d 5f fa d0 00 5c 67 47 91 3f dd a2 d3 10 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 18:58:02 -0400 (0:00:01.243) 0:09:19.877 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 18:58:03 -0400 (0:00:00.576) 0:09:20.453 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 18:58:04 -0400 (0:00:00.698) 0:09:21.151 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 18:58:04 -0400 (0:00:00.428) 0:09:21.579 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 18:58:04 -0400 (0:00:00.395) 0:09:21.975 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 18:58:05 -0400 (0:00:01.004) 0:09:22.979 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 18:58:06 -0400 (0:00:00.267) 0:09:23.247 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 18:58:06 -0400 (0:00:00.216) 0:09:23.464 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 18:58:07 -0400 (0:00:00.682) 0:09:24.146 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 18:58:07 -0400 (0:00:00.711) 0:09:24.858 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 18:58:08 -0400 (0:00:00.648) 0:09:25.506 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 18:58:10 -0400 (0:00:01.867) 0:09:27.374 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 18:58:10 -0400 (0:00:00.551) 0:09:27.925 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": 
null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 18:58:11 -0400 (0:00:00.281) 0:09:28.207 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 18:58:11 -0400 (0:00:00.181) 0:09:28.388 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 18:58:11 -0400 (0:00:00.172) 0:09:28.561 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 18:58:11 -0400 (0:00:00.193) 0:09:28.755 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 18:58:11 -0400 (0:00:00.284) 0:09:29.039 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 18:58:12 -0400 (0:00:00.182) 0:09:29.222 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 18:58:12 -0400 (0:00:00.238) 0:09:29.461 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 18:58:12 -0400 (0:00:00.259) 0:09:29.720 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" 
} TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 18:58:12 -0400 (0:00:00.315) 0:09:30.036 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 18:58:13 -0400 (0:00:00.254) 0:09:30.290 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 18:58:13 -0400 (0:00:00.311) 0:09:30.601 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 18:58:14 -0400 (0:00:00.652) 0:09:31.254 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 18:58:14 -0400 (0:00:00.620) 0:09:31.875 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 18:58:15 -0400 (0:00:00.672) 0:09:32.548 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 18:58:15 -0400 (0:00:00.251) 0:09:32.800 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 18:58:16 -0400 (0:00:00.600) 0:09:33.400 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 18:58:16 -0400 (0:00:00.486) 0:09:33.887 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 18:58:17 -0400 (0:00:00.718) 0:09:34.605 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 18:58:18 -0400 (0:00:00.644) 0:09:35.250 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 18:58:18 -0400 (0:00:00.526) 0:09:35.776 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 18:58:19 -0400 (0:00:00.692) 0:09:36.469 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 18:58:20 -0400 (0:00:00.627) 0:09:37.096 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 18:58:20 -0400 (0:00:00.568) 0:09:37.665 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 18:58:21 -0400 (0:00:00.679) 0:09:38.344 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 18:58:21 -0400 (0:00:00.691) 0:09:39.035 ****** skipping: [managed-node8] => { "changed": false, 
"false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 18:58:22 -0400 (0:00:00.641) 0:09:39.677 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 18:58:23 -0400 (0:00:00.428) 0:09:40.105 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 18:58:23 -0400 (0:00:00.776) 0:09:40.882 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 18:58:24 -0400 (0:00:00.613) 0:09:41.496 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 18:58:24 -0400 (0:00:00.528) 0:09:42.025 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 18:58:25 -0400 (0:00:00.498) 0:09:42.524 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 18:58:26 -0400 (0:00:00.589) 0:09:43.113 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 18:58:26 -0400 (0:00:00.611) 0:09:43.725 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 
04 October 2025 18:58:27 -0400 (0:00:00.689) 0:09:44.415 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 18:58:28 -0400 (0:00:00.912) 0:09:45.327 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 18:58:28 -0400 (0:00:00.306) 0:09:45.634 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 18:58:28 -0400 (0:00:00.277) 0:09:45.911 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 18:58:29 -0400 (0:00:00.542) 0:09:46.455 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 18:58:29 -0400 (0:00:00.264) 0:09:46.719 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 18:58:29 -0400 (0:00:00.290) 0:09:47.010 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 18:58:30 -0400 (0:00:00.336) 0:09:47.346 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 18:58:30 -0400 (0:00:00.299) 0:09:47.645 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 18:58:30 -0400 (0:00:00.206) 0:09:47.852 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 18:58:31 -0400 (0:00:00.250) 0:09:48.102 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 18:58:31 -0400 (0:00:00.283) 0:09:48.386 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 18:58:31 -0400 (0:00:00.239) 0:09:48.626 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 18:58:32 -0400 (0:00:00.530) 0:09:49.156 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 18:58:32 -0400 (0:00:00.169) 0:09:49.325 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:247 Saturday 04 October 2025 18:58:33 -0400 (0:00:01.213) 0:09:50.539 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 18:58:34 -0400 (0:00:01.044) 0:09:51.583 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 18:58:35 -0400 (0:00:00.729) 0:09:52.313 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:58:35 -0400 (0:00:00.591) 0:09:52.904 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:58:36 -0400 (0:00:00.485) 0:09:53.390 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:58:37 -0400 (0:00:00.738) 0:09:54.128 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 
Saturday 04 October 2025 18:58:37 -0400 (0:00:00.772) 0:09:54.901 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 18:58:38 -0400 (0:00:00.276) 0:09:55.177 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 18:58:38 -0400 (0:00:00.278) 0:09:55.456 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 18:58:38 -0400 (0:00:00.238) 0:09:55.694 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 18:58:38 -0400 (0:00:00.233) 0:09:55.928 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 18:58:40 -0400 (0:00:01.523) 0:09:57.451 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 18:58:42 -0400 (0:00:02.118) 0:09:59.571 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 18:58:43 -0400 (0:00:00.518) 0:10:00.090 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 18:58:43 -0400 (0:00:00.639) 0:10:00.729 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:58:46 -0400 (0:00:02.527) 0:10:03.257 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:58:46 -0400 (0:00:00.608) 0:10:03.866 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:58:47 -0400 (0:00:00.623) 0:10:04.490 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:58:48 -0400 (0:00:00.567) 0:10:05.057 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:58:48 -0400 (0:00:00.584) 0:10:05.666 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:58:50 -0400 (0:00:02.324) 0:10:07.991 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service": { "name": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:58:54 -0400 (0:00:03.994) 
0:10:12.014 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:58:55 -0400 (0:00:00.645) 0:10:12.662 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d888b131a\x2d655c\x2d4527\x2d921b\x2d02a5849fd56f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "name": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket cryptsetup-pre.target dev-sda.device systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d888b131a\\\\x2d655c\\\\x2d4527\\\\x2d921b\\\\x2d02a5849fd56f.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-888b131a-655c-4527-921b-02a5849fd56f", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-888b131a-655c-4527-921b-02a5849fd56f /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-888b131a-655c-4527-921b-02a5849fd56f /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-888b131a-655c-4527-921b-02a5849fd56f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-888b131a-655c-4527-921b-02a5849fd56f ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d888b131a\\\\x2d655c\\\\x2d4527\\\\x2d921b\\\\x2d02a5849fd56f.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", 
"NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 18:57:06 EDT", "StateChangeTimestampMonotonic": "2175373114", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d888b131a\\\\x2d655c\\\\x2d4527\\\\x2d921b\\\\x2d02a5849fd56f.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:58:57 -0400 (0:00:01.709) 0:10:14.372 ****** fatal: [managed-node8]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-b03103b5-9920-48d2-9376-8a11631c90a5' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 18:59:00 -0400 (0:00:02.797) 0:10:17.170 ****** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-b03103b5-9920-48d2-9376-8a11631c90a5' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:59:00 -0400 (0:00:00.415) 0:10:17.585 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d888b131a\x2d655c\x2d4527\x2d921b\x2d02a5849fd56f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "name": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": 
"systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d888b131a\\x2d655c\\x2d4527\\x2d921b\\x2d02a5849fd56f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d888b131a\\\\x2d655c\\\\x2d4527\\\\x2d921b\\\\x2d02a5849fd56f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 18:59:02 -0400 (0:00:01.612) 0:10:19.197 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 18:59:02 -0400 (0:00:00.325) 0:10:19.522 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 18:59:03 -0400 (0:00:00.550) 0:10:20.073 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 04 October 2025 18:59:03 -0400 (0:00:00.362) 0:10:20.435 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618713.1988072, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618713.1988072, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759618713.1988072, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "489264936", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 04 October 2025 18:59:04 -0400 (0:00:01.332) 0:10:21.767 
****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272 Saturday 04 October 2025 18:59:05 -0400 (0:00:00.429) 0:10:22.197 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 18:59:06 -0400 (0:00:01.787) 0:10:23.985 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 18:59:07 -0400 (0:00:00.520) 0:10:24.506 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 18:59:08 -0400 (0:00:00.756) 0:10:25.262 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 18:59:09 -0400 (0:00:00.798) 0:10:26.060 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : 
Set flag to indicate system is ostree] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 04 October 2025 18:59:09 -0400 (0:00:00.354) 0:10:26.415 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 04 October 2025 18:59:09 -0400 (0:00:00.339) 0:10:26.755 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 04 October 2025 18:59:10 -0400 (0:00:00.374) 0:10:27.129 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 04 October 2025 18:59:10 -0400 (0:00:00.295) 0:10:27.424 ******
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 04 October 2025 18:59:11 -0400 (0:00:00.748) 0:10:28.173 ******
ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 04 October 2025 18:59:13 -0400 (0:00:02.459) 0:10:30.632 ******
ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 04 October 2025 18:59:14 -0400 (0:00:00.678) 0:10:31.311 ******
ok: [managed-node8] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 04 October 2025 18:59:14 -0400 (0:00:00.710) 0:10:32.021 ******
ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
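The storage_pools value shown above is the input for this pass over the role: the same pool/volume spec as before, but with encryption: false, which makes the role tear the LUKS layer off sda1 and reformat it with xfs (as the action list further below confirms). A minimal sketch of a play that would drive this invocation, assuming a standard include_role wrapper — the host pattern and task name are illustrative, while the pool and volume values are taken verbatim from the output above:

    - hosts: managed-node8
      tasks:
        - name: Remove the encryption layer
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: false
                    encryption_luks_version: luks2
                    encryption_password: yabbadabbadoo

TASK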
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 18:59:17 -0400 (0:00:02.696) 0:10:34.717 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 18:59:18 -0400 (0:00:00.594) 0:10:35.311 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 18:59:18 -0400 (0:00:00.500) 0:10:35.812 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 18:59:19 -0400 (0:00:00.565) 0:10:36.377 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 18:59:20 -0400 (0:00:00.735) 0:10:37.113 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 18:59:22 -0400 (0:00:02.752) 0:10:39.865 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": 
"dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", 
"state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service": { "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 18:59:27 -0400 (0:00:05.136) 0:10:45.001 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 18:59:28 -0400 (0:00:00.973) 0:10:45.975 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2db03103b5\x2d9920\x2d48d2\x2d9376\x2d8a11631c90a5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target dev-sda1.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-b03103b5-9920-48d2-9376-8a11631c90a5", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b03103b5-9920-48d2-9376-8a11631c90a5 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b03103b5-9920-48d2-9376-8a11631c90a5 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 18:59:01 EDT", "StateChangeTimestampMonotonic": "2291218243", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 18:59:30 -0400 (0:00:01.737) 0:10:47.734 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": 
"/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 18:59:33 -0400 (0:00:03.152) 0:10:50.886 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 18:59:34 -0400 (0:00:00.446) 0:10:51.359 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618623.4060235, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5b8fbf120d7186d4ac4fcddc94a3cf424d123ebc", "ctime": 1759618623.4030235, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618623.4030235, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, 
"xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 18:59:35 -0400 (0:00:01.220) 0:10:52.580 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 18:59:36 -0400 (0:00:01.352) 0:10:53.932 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2db03103b5\x2d9920\x2d48d2\x2d9376\x2d8a11631c90a5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": 
"no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 18:59:01 EDT", "StateChangeTimestampMonotonic": "2291218243", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 18:59:38 -0400 (0:00:01.624) 0:10:55.557 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 18:59:39 -0400 (0:00:01.350) 0:10:56.908 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 18:59:40 -0400 (0:00:00.409) 0:10:57.317 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 18:59:40 -0400 (0:00:00.391) 0:10:57.709 ****** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b03103b5-9920-48d2-9376-8a11631c90a5" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 18:59:42 -0400 (0:00:01.746) 0:10:59.456 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 18:59:44 -0400 (0:00:01.975) 0:11:01.432 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 18:59:46 -0400 (0:00:01.858) 0:11:03.317 ****** skipping: [managed-node8] => (item={'src': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 18:59:47 -0400 (0:00:00.826) 0:11:04.143 ****** ok: [managed-node8] => { "changed": false, "name": null, 
"status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 18:59:49 -0400 (0:00:02.012) 0:11:06.156 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618639.1431608, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d096647ec6e0fd99b8d120618653400813e62541", "ctime": 1759618629.769079, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 448791325, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759618629.769507, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1272891951", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 18:59:50 -0400 (0:00:01.304) 0:11:07.461 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-b03103b5-9920-48d2-9376-8a11631c90a5', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-b03103b5-9920-48d2-9376-8a11631c90a5", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 18:59:52 -0400 (0:00:01.882) 0:11:09.344 ****** ok: [managed-node8] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Saturday 04 October 2025 18:59:54 -0400 (0:00:02.175) 0:11:11.519 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 18:59:55 -0400 (0:00:01.172) 0:11:12.691 ****** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": 
"/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 18:59:56 -0400 (0:00:00.708) 0:11:13.400 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 18:59:56 -0400 (0:00:00.575) 0:11:13.976 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "dda2579b-d955-40cd-b935-01fe024f1450" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 18:59:58 -0400 (0:00:01.300) 0:11:15.276 ****** ok: 
[managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002901", "end": "2025-10-04 18:59:59.213966", "rc": 0, "start": "2025-10-04 18:59:59.211065" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=dda2579b-d955-40cd-b935-01fe024f1450 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 18:59:59 -0400 (0:00:01.241) 0:11:16.518 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002606", "end": "2025-10-04 19:00:00.516978", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:00:00.514372" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:00:00 -0400 (0:00:01.303) 0:11:17.822 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 
0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 19:00:01 -0400 (0:00:00.974) 0:11:18.796 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 19:00:02 -0400 (0:00:00.336) 0:11:19.133 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 19:00:02 -0400 (0:00:00.261) 0:11:19.394 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 19:00:02 -0400 (0:00:00.225) 0:11:19.619 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 19:00:03 -0400 (0:00:00.681) 0:11:20.301 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 19:00:03 -0400 (0:00:00.096) 0:11:20.398 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 19:00:03 -0400 (0:00:00.150) 
0:11:20.549 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 04 October 2025 19:00:03 -0400 (0:00:00.266) 0:11:20.815 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:00:04 -0400 (0:00:00.287) 0:11:21.103 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:00:04 -0400 (0:00:00.215) 0:11:21.318 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:00:04 -0400 (0:00:00.295) 0:11:21.614 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:00:04 -0400 (0:00:00.337) 0:11:21.952 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:00:05 -0400 (0:00:00.252) 0:11:22.204 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:00:05 -0400 (0:00:00.222) 0:11:22.426 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:71281): WARNING **: 19:00:06.189: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 
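The rc-0 result above comes from a short probe the test runs on the managed node to decide whether the installed blivet can grow LVM PVs to fill their devices. The probe's source is not shown in the log; the libbd_nvme message is only libblockdev failing to load an optional plugin, and the OpenSSH debug1/debug2 chatter around it is verbose connection logging, not probe output. A hypothetical sketch of such a probe as an Ansible task; the embedded attribute check is assumed for illustration and is not taken from the role:

- name: Check that blivet supports PV grow to fill (illustrative sketch)
  ansible.builtin.command:
    cmd: >-
      python3 -c "import sys;
      from blivet.formats.lvmpv import LVMPhysicalVolume;
      sys.exit(0 if hasattr(LVMPhysicalVolume, 'grow_to_fill') else 1)"
  register: storage_test_grow_support
  failed_when: false
  changed_when: false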
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 19:00:06 -0400 (0:00:01.132) 0:11:23.559 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 19:00:06 -0400 (0:00:00.492) 0:11:24.051 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 19:00:07 -0400 (0:00:00.826) 0:11:24.878 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 19:00:08 -0400 (0:00:00.273) 0:11:25.151 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 19:00:08 -0400 (0:00:00.227) 0:11:25.378 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 19:00:08 -0400 (0:00:00.258) 0:11:25.637 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 19:00:08 -0400 (0:00:00.238) 0:11:25.876 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:00:09 -0400 (0:00:00.296) 0:11:26.172 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:00:09 -0400 (0:00:00.269) 0:11:26.441 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:00:09 -0400 (0:00:00.226) 0:11:26.668 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:00:09 -0400 (0:00:00.286) 0:11:26.954 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:00:10 -0400 (0:00:00.269) 0:11:27.223 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:00:10 -0400 (0:00:00.279) 0:11:27.503 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:00:10 -0400 (0:00:00.237) 0:11:27.741 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:00:11 -0400 (0:00:00.753) 0:11:28.494 ****** skipping: [managed-node8] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:00:11 -0400 (0:00:00.416) 0:11:28.910 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:00:12 -0400 (0:00:00.722) 0:11:29.633 ****** skipping: [managed-node8] => (item={'encryption': False, 
'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:00:12 -0400 (0:00:00.341) 0:11:29.975 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:00:13 -0400 (0:00:00.783) 0:11:30.759 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:00:14 -0400 (0:00:00.631) 0:11:31.391 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:00:14 -0400 (0:00:00.231) 0:11:31.622 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:00:15 -0400 (0:00:01.193) 0:11:32.816 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:00:16 -0400 (0:00:00.291) 0:11:33.107 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:00:16 -0400 (0:00:00.714) 0:11:33.822 ****** skipping: [managed-node8] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:00:17 -0400 (0:00:00.417) 0:11:34.239 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:00:18 -0400 (0:00:01.053) 0:11:35.293 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:00:18 -0400 (0:00:00.297) 0:11:35.591 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:00:18 -0400 (0:00:00.341) 0:11:35.932 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:00:19 -0400 (0:00:00.269) 0:11:36.202 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:00:19 -0400 (0:00:00.277) 0:11:36.480 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:00:19 
-0400 (0:00:00.314) 0:11:36.794 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:00:19 -0400 (0:00:00.244) 0:11:37.039 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:00:20 -0400 (0:00:00.294) 0:11:37.333 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:00:20 -0400 (0:00:00.246) 0:11:37.579 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:00:21 -0400 (0:00:00.486) 0:11:38.066 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:00:21 -0400 (0:00:00.965) 0:11:39.031 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:00:23 -0400 (0:00:01.846) 0:11:40.878 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:00:24 -0400 (0:00:00.477) 0:11:41.355 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:00:25 -0400 (0:00:00.718) 0:11:42.073 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:00:25 -0400 (0:00:00.717) 0:11:42.791 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:00:26 -0400 (0:00:00.329) 0:11:43.121 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:00:26 -0400 (0:00:00.701) 0:11:43.822 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:00:27 -0400 (0:00:00.885) 0:11:44.707 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:00:28 -0400 (0:00:00.766) 0:11:45.474 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:00:28 -0400 (0:00:00.274) 0:11:45.748 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:00:28 -0400 (0:00:00.268) 0:11:46.017 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:00:29 -0400 (0:00:00.279) 0:11:46.296 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:00:29 -0400 (0:00:00.283) 0:11:46.580 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=dda2579b-d955-40cd-b935-01fe024f1450 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:00:30 -0400 (0:00:01.035) 0:11:47.616 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:00:31 -0400 (0:00:00.665) 0:11:48.282 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:00:31 -0400 (0:00:00.660) 0:11:48.942 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:00:32 -0400 (0:00:00.597) 0:11:49.539 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:00:33 -0400 (0:00:00.558) 0:11:50.097 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:00:33 -0400 (0:00:00.240) 0:11:50.337 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:00:33 -0400 (0:00:00.681) 0:11:51.019 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:00:34 -0400 (0:00:00.509) 0:11:51.529 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618773.5003338, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618773.5003338, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1229, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759618773.5003338, "nlink": 1, 
"path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:00:35 -0400 (0:00:01.272) 0:11:52.801 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:00:36 -0400 (0:00:00.383) 0:11:53.185 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:00:36 -0400 (0:00:00.268) 0:11:53.454 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:00:36 -0400 (0:00:00.357) 0:11:53.811 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:00:37 -0400 (0:00:00.335) 0:11:54.147 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:00:37 -0400 (0:00:00.318) 0:11:54.465 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:00:37 -0400 (0:00:00.347) 0:11:54.812 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:00:38 -0400 (0:00:00.296) 0:11:55.109 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:00:40 -0400 (0:00:02.410) 0:11:57.519 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:00:40 -0400 (0:00:00.236) 0:11:57.756 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:00:41 -0400 (0:00:00.351) 0:11:58.108 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:00:41 -0400 (0:00:00.868) 0:11:58.976 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:00:42 -0400 (0:00:00.254) 0:11:59.230 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:00:42 -0400 (0:00:00.227) 0:11:59.457 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:00:42 -0400 (0:00:00.261) 0:11:59.719 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:00:42 -0400 (0:00:00.210) 0:11:59.929 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: 
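
The LUKS version, key size, and cipher checks above are all skipped here because storage_test_volume.encryption evaluated false for this verification pass. When they do run, checks of this kind can be expressed as a cryptsetup luksDump plus assertions on its output. A rough sketch of that idea under the same skip condition; the luks_dump register name is made up and this is not the role's actual implementation:

  - name: Collect LUKS info for the backing device
    ansible.builtin.command: cryptsetup luksDump /dev/sda1
    register: luks_dump
    changed_when: false
    when: storage_test_volume.encryption | d(false) | bool

  - name: Check that the header is LUKS version 2
    ansible.builtin.assert:
      that:
        - luks_dump.stdout is search('Version:\s+2')
    when: storage_test_volume.encryption | d(false) | bool
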
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:00:43 -0400 (0:00:00.266) 0:12:00.196 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:00:43 -0400 (0:00:00.636) 0:12:00.832 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:00:44 -0400 (0:00:00.567) 0:12:01.400 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:00:44 -0400 (0:00:00.622) 0:12:02.022 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:00:45 -0400 (0:00:00.733) 0:12:02.756 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:00:46 -0400 (0:00:00.572) 0:12:03.328 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 19:00:46 -0400 (0:00:00.299) 0:12:03.627 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:00:46 -0400 (0:00:00.249) 0:12:03.877 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
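
The crypttab validation above boils down to counting the /etc/crypttab lines that reference the volume and comparing that count with the expected value (here "0", since the volume is not encrypted at this point in the test). A simplified sketch of the same comparison, assuming /etc/crypttab exists and using made-up names (crypttab_raw as the register, 'luks-' as the match pattern):

  - name: Read /etc/crypttab
    ansible.builtin.slurp:
      src: /etc/crypttab
    register: crypttab_raw

  - name: Check that the volume has no crypttab entry
    ansible.builtin.assert:
      that:
        - "(crypttab_raw.content | b64decode).splitlines() | select('search', 'luks-') | list | length == 0"
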
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:00:47 -0400 (0:00:00.301) 0:12:04.179 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:00:47 -0400 (0:00:00.242) 0:12:04.422 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:00:47 -0400 (0:00:00.263) 0:12:04.685 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:00:47 -0400 (0:00:00.243) 0:12:04.928 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:00:49 -0400 (0:00:01.277) 0:12:06.206 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:00:49 -0400 (0:00:00.339) 0:12:06.546 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:00:49 -0400 (0:00:00.257) 0:12:06.803 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:00:50 -0400 (0:00:00.275) 0:12:07.079 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 
October 2025 19:00:50 -0400 (0:00:00.217) 0:12:07.296 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:00:50 -0400 (0:00:00.574) 0:12:07.870 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:00:51 -0400 (0:00:00.718) 0:12:08.589 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:00:52 -0400 (0:00:00.468) 0:12:09.058 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:00:52 -0400 (0:00:00.264) 0:12:09.322 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:00:52 -0400 (0:00:00.613) 0:12:09.936 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:00:53 -0400 (0:00:00.638) 0:12:10.574 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:00:54 -0400 (0:00:00.633) 0:12:11.207 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:00:54 -0400 (0:00:00.709) 0:12:11.917 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task 
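
Every size-expectation task above is gated on storage_test_volume.type == "lvm", so for this partition-backed volume the expected size is never established and storage_test_expected_size stays undefined; the "VARIABLE IS NOT DEFINED!" output is therefore expected in this run rather than an error. A schematic sketch of the gated pattern, not the role's actual task:

  - name: Establish base value for expected size
    ansible.builtin.set_fact:
      storage_test_expected_size: "{{ storage_test_volume.size }}"
    when: storage_test_volume.type == "lvm"
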
path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:00:55 -0400 (0:00:00.600) 0:12:12.518 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:00:56 -0400 (0:00:00.672) 0:12:13.190 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:00:56 -0400 (0:00:00.629) 0:12:13.820 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:00:57 -0400 (0:00:00.639) 0:12:14.459 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:00:58 -0400 (0:00:00.660) 0:12:15.119 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:00:58 -0400 (0:00:00.770) 0:12:15.890 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:00:59 -0400 (0:00:00.776) 0:12:16.666 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:01:00 -0400 (0:00:00.697) 0:12:17.364 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:01:01 -0400 (0:00:00.783) 0:12:18.148 ****** 
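
All of the thin-pool arithmetic above hangs off a single condition, storage_test_volume.thin | bool, which is why the whole stretch is skipped for a plain partition. A schematic example of that gating; the variable names and the 20% reservation figure are placeholders chosen for illustration, not the role's actual defaults:

  - name: Calculate maximum usable space in thin pool
    ansible.builtin.set_fact:
      storage_test_max_thin_pool_size: "{{ (storage_test_pool_size | int * 0.8) | int }}"
    when: storage_test_volume.thin | bool
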
skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:01:01 -0400 (0:00:00.609) 0:12:18.757 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:01:02 -0400 (0:00:00.579) 0:12:19.336 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:01:02 -0400 (0:00:00.571) 0:12:19.907 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:01:03 -0400 (0:00:00.651) 0:12:20.559 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:01:04 -0400 (0:00:00.639) 0:12:21.220 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:01:04 -0400 (0:00:00.762) 0:12:21.982 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:01:05 -0400 (0:00:00.614) 0:12:22.596 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:01:05 -0400 (0:00:00.326) 0:12:22.923 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] 
************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:01:06 -0400 (0:00:00.384) 0:12:23.307 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:01:06 -0400 (0:00:00.706) 0:12:24.014 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:01:07 -0400 (0:00:00.254) 0:12:24.268 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:01:07 -0400 (0:00:00.251) 0:12:24.519 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:01:07 -0400 (0:00:00.334) 0:12:24.853 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:01:08 -0400 (0:00:00.313) 0:12:25.167 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:01:08 -0400 (0:00:00.282) 0:12:25.449 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:01:08 -0400 (0:00:00.321) 0:12:25.770 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' 
and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:01:09 -0400 (0:00:00.304) 0:12:26.074 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:01:09 -0400 (0:00:00.323) 0:12:26.398 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:01:09 -0400 (0:00:00.535) 0:12:26.934 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 19:01:10 -0400 (0:00:00.288) 0:12:27.222 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:296 Saturday 04 October 2025 19:01:11 -0400 (0:00:01.260) 0:12:28.483 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 19:01:12 -0400 (0:00:01.086) 0:12:29.569 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 19:01:13 -0400 (0:00:00.653) 0:12:30.223 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:01:13 -0400 (0:00:00.547) 0:12:30.770 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:01:14 -0400 (0:00:00.435) 
0:12:31.206 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:01:14 -0400 (0:00:00.640) 0:12:31.846 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:01:15 -0400 (0:00:00.684) 0:12:32.531 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:01:15 -0400 (0:00:00.351) 0:12:32.883 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:01:16 -0400 (0:00:00.339) 0:12:33.222 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:01:16 -0400 (0:00:00.307) 0:12:33.530 ****** ok: 
[managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:01:16 -0400 (0:00:00.279) 0:12:33.809 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:01:17 -0400 (0:00:00.617) 0:12:34.426 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:01:19 -0400 (0:00:02.381) 0:12:36.808 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:01:20 -0400 (0:00:00.688) 0:12:37.496 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:01:21 -0400 (0:00:00.726) 0:12:38.223 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:01:24 -0400 (0:00:03.572) 0:12:41.795 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:01:26 -0400 (0:00:01.551) 0:12:43.347 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:01:26 -0400 (0:00:00.508) 
0:12:43.855 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:01:27 -0400 (0:00:00.660) 0:12:44.516 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:01:28 -0400 (0:00:00.592) 0:12:45.108 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:01:30 -0400 (0:00:02.424) 0:12:47.533 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" 
}, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service": { "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:01:33 -0400 (0:00:02.870) 0:12:50.404 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:01:34 -0400 (0:00:00.760) 0:12:51.165 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2db03103b5\x2d9920\x2d48d2\x2d9376\x2d8a11631c90a5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": 
"\"blockdev@dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-b03103b5-9920-48d2-9376-8a11631c90a5", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b03103b5-9920-48d2-9376-8a11631c90a5 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b03103b5-9920-48d2-9376-8a11631c90a5 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b03103b5-9920-48d2-9376-8a11631c90a5 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": 
"18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", 
"SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 18:59:01 EDT", "StateChangeTimestampMonotonic": "2291218243", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:01:35 -0400 (0:00:01.481) 0:12:52.646 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 19:01:38 -0400 (0:00:02.409) 0:12:55.056 ****** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:01:38 -0400 (0:00:00.479) 0:12:55.536 ****** changed: [managed-node8] => 
(item=systemd-cryptsetup@luks\x2db03103b5\x2d9920\x2d48d2\x2d9376\x2d8a11631c90a5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "name": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2db03103b5\\x2d9920\\x2d48d2\\x2d9376\\x2d8a11631c90a5.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2db03103b5\\\\x2d9920\\\\x2d48d2\\\\x2d9376\\\\x2d8a11631c90a5.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": 
"1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 19:01:40 -0400 (0:00:01.691) 0:12:57.227 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 19:01:40 -0400 (0:00:00.304) 0:12:57.532 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 19:01:40 -0400 (0:00:00.466) 0:12:57.998 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 04 October 2025 19:01:41 -0400 (0:00:00.372) 0:12:58.371 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618871.1611862, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618871.1611862, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759618871.1611862, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1846344341", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 04 October 2025 19:01:42 -0400 (0:00:01.050) 0:12:59.421 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:323 Saturday 04 October 2025 19:01:42 -0400 (0:00:00.405) 0:12:59.827 ****** ok: [managed-node8] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testbw1r7k_7lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write 
the key into the key file] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:330 Saturday 04 October 2025 19:01:45 -0400 (0:00:02.793) 0:13:02.620 ****** ok: [managed-node8] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testbw1r7k_7lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1759618905.9044282-126601-71803945212766/.source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:337 Saturday 04 October 2025 19:01:49 -0400 (0:00:03.506) 0:13:06.126 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:01:49 -0400 (0:00:00.486) 0:13:06.613 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:01:49 -0400 (0:00:00.400) 0:13:07.014 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:01:50 -0400 (0:00:00.703) 0:13:07.717 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:01:51 -0400 (0:00:00.841) 0:13:08.559 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:01:51 -0400 (0:00:00.282) 0:13:08.842 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:01:52 -0400 (0:00:00.408) 0:13:09.250 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:01:52 -0400 (0:00:00.280) 0:13:09.531 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:01:52 -0400 (0:00:00.386) 0:13:09.917 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:01:53 -0400 (0:00:00.856) 0:13:10.774 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:01:56 -0400 (0:00:02.545) 0:13:13.319 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:01:56 -0400 (0:00:00.644) 0:13:13.964 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:01:57 -0400 (0:00:00.634) 0:13:14.599 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:02:00 -0400 (0:00:02.579) 0:13:17.179 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:02:00 -0400 (0:00:00.734) 0:13:17.913 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:02:01 -0400 (0:00:00.677) 0:13:18.590 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:02:02 -0400 (0:00:00.605) 0:13:19.196 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:02:02 -0400 (0:00:00.529) 0:13:19.725 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:02:04 -0400 (0:00:02.314) 0:13:22.040 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": 
"auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { 
"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:02:08 -0400 (0:00:03.098) 0:13:25.139 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:02:08 -0400 (0:00:00.839) 0:13:25.978 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:02:09 -0400 (0:00:00.266) 0:13:26.244 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "/tmp/storage_testbw1r7k_7lukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:02:22 -0400 (0:00:12.869) 0:13:39.114 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:02:22 -0400 (0:00:00.560) 0:13:39.674 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618785.9344423, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "89d5ffdd4f32a4389c05d229b857b27cd1d58a14", "ctime": 1759618785.9314423, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618785.9314423, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 19:02:23 -0400 (0:00:01.276) 0:13:40.950 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:02:25 -0400 (0:00:01.351) 0:13:42.302 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:02:25 -0400 (0:00:00.268) 0:13:42.571 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "/tmp/storage_testbw1r7k_7lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", 
"/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:02:25 -0400 (0:00:00.434) 0:13:43.006 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:02:26 -0400 (0:00:00.423) 0:13:43.429 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 19:02:26 -0400 (0:00:00.292) 0:13:43.721 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': 'UUID=dda2579b-d955-40cd-b935-01fe024f1450', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=dda2579b-d955-40cd-b935-01fe024f1450" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 19:02:28 -0400 (0:00:01.756) 0:13:45.483 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 19:02:30 -0400 (0:00:02.086) 0:13:47.569 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "mounted" }, 
"name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 19:02:32 -0400 (0:00:01.726) 0:13:49.296 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 19:02:32 -0400 (0:00:00.679) 0:13:49.975 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 19:02:34 -0400 (0:00:01.887) 0:13:51.862 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618800.5155694, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759618792.0254955, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67109061, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1759618792.0267315, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1524664747", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 19:02:36 -0400 (0:00:01.210) 0:13:53.072 ****** changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', 'password': '/tmp/storage_testbw1r7k_7lukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "/tmp/storage_testbw1r7k_7lukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Saturday 04 October 2025 19:02:38 -0400 (0:00:02.554) 0:13:55.627 ******
ok: [managed-node8]

TASK [Verify role results - 6] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:355
Saturday 04 October 2025 19:02:42 -0400 (0:00:04.144) 0:13:59.771 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8

TASK [Print out pool information] **********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Saturday 04 October 2025 19:02:43 -0400 (0:00:00.537) 0:14:00.309 ******
ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Saturday 04 October 2025 19:02:44 -0400 (0:00:00.748) 0:14:01.057 ******
skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" }
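The pool list printed above is essentially an echo of the role's storage_pools input, with the computed fields (_device, _mount_id, _raw_device, _kernel_device) filled in by the role. A minimal playbook sketch that would request this state — a reconstruction from the values observed in this run, not a copy of the test's literal task file:

    # Sketch only: reconstructed from this run's output, not taken from tests_luks2.yml.
    - hosts: managed-node8
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    mount_options: defaults
                    encryption: true
                    encryption_luks_version: luks2
                    encryption_key: /tmp/storage_testbw1r7k_7lukskey  # the test's temporary key file

Because encryption is requested on the volume, the role mounts /dev/mapper/luks-<UUID> rather than the raw /dev/sda1; the fstab and crypttab reads that follow verify the mapper path and the mapping back to /dev/sda1, respectively.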
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Saturday 04 October 2025 19:02:44 -0400 (0:00:00.468) 0:14:01.526 ******
ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "size": "4G", "type": "crypt", "uuid": "a6deaccd-415e-4165-af29-a60e1cab8b55" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "19bbcb22-6ec7-4d41-91bc-3354e1a43d49" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Saturday 04 October 2025 19:02:45 -0400 (0:00:01.268) 0:14:02.795 ******
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002715", "end": "2025-10-04 19:02:46.682537", "rc": 0, "start": "2025-10-04 19:02:46.679822" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Tue Sep 30 13:43:10 2025
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
# UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 19:02:46 -0400 (0:00:01.135) 0:14:03.930 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002567", "end": "2025-10-04 19:02:47.852891", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:02:47.850324" } STDOUT: luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /dev/sda1 /tmp/storage_testbw1r7k_7lukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:02:48 -0400 (0:00:01.255) 0:14:05.185 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testbw1r7k_7lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 19:02:49 -0400 (0:00:00.949) 0:14:06.135 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 19:02:49 -0400 (0:00:00.364) 0:14:06.500 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 19:02:49 -0400 (0:00:00.286) 0:14:06.787 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 19:02:50 -0400 (0:00:00.278) 0:14:07.065 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 19:02:50 -0400 (0:00:00.716) 0:14:07.781 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 19:02:50 -0400 (0:00:00.249) 0:14:08.031 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 19:02:51 -0400 (0:00:00.213) 0:14:08.245 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 
04 October 2025 19:02:51 -0400 (0:00:00.256) 0:14:08.501 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:02:51 -0400 (0:00:00.339) 0:14:08.840 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:02:52 -0400 (0:00:00.240) 0:14:09.081 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:02:52 -0400 (0:00:00.331) 0:14:09.413 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:02:52 -0400 (0:00:00.300) 0:14:09.713 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:02:52 -0400 (0:00:00.309) 0:14:10.022 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:02:53 -0400 (0:00:00.237) 0:14:10.260 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:76954): WARNING **: 19:02:54.047: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 19:02:54 -0400 (0:00:01.152) 0:14:11.413 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 19:02:54 -0400 (0:00:00.573) 0:14:11.986 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 19:02:55 -0400 (0:00:00.684) 0:14:12.671 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 19:02:55 -0400 (0:00:00.250) 0:14:12.922 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 19:02:56 -0400 (0:00:00.320) 0:14:13.242 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 19:02:56 -0400 (0:00:00.284) 0:14:13.527 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 19:02:56 -0400 (0:00:00.274) 0:14:13.801 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:02:57 -0400 (0:00:00.291) 0:14:14.092 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:02:57 -0400 (0:00:00.409) 0:14:14.501 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:02:57 -0400 (0:00:00.232) 0:14:14.734 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:02:57 -0400 (0:00:00.307) 0:14:15.042 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:02:58 -0400 (0:00:00.275) 0:14:15.317 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:02:58 -0400 (0:00:00.248) 0:14:15.566 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:02:58 -0400 (0:00:00.295) 0:14:15.861 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:02:59 -0400 (0:00:00.652) 0:14:16.513 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testbw1r7k_7lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:02:59 -0400 (0:00:00.376) 0:14:16.890 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:03:00 -0400 (0:00:00.709) 0:14:17.600 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testbw1r7k_7lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 
'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:03:00 -0400 (0:00:00.286) 0:14:17.886 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:03:01 -0400 (0:00:00.819) 0:14:18.706 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:03:02 -0400 (0:00:00.755) 0:14:19.461 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }
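
The loop items above show the fully resolved volume specification that each verify task iterates over. Because the volume's type is 'partition' rather than 'lvm', the LVM RAID, thin-pool, and VDO validations skip every item. The underscore-prefixed keys are computed by the role rather than supplied by the test; a minimal annotated sketch of how those computed fields relate for this encrypted partition (values copied from the item above):

    # Computed by the role for an encrypted partition volume:
    _raw_device: /dev/sda1                                             # partition that holds the LUKS container
    _device: /dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49     # opened LUKS mapping carrying the xfs filesystem
    _kernel_device: /dev/dm-0                                          # kernel name of that device-mapper node
    _mount_id: /dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49   # identifier the role writes to /etc/fstab
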
TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:03:02 -0400 (0:00:00.270) 0:14:19.732 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:03:02 -0400 (0:00:00.263) 0:14:19.996 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:03:03 -0400 (0:00:00.308) 0:14:20.304 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:03:04 -0400 (0:00:00.761) 0:14:21.066 ****** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testbw1r7k_7lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testbw1r7k_7lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs",
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:03:04 -0400 (0:00:00.394) 0:14:21.460 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:03:05 -0400 (0:00:00.942) 0:14:22.402 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:03:05 -0400 (0:00:00.269) 0:14:22.671 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:03:05 -0400 (0:00:00.257) 0:14:22.929 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:03:06 -0400 (0:00:00.264) 0:14:23.194 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:03:06 -0400 (0:00:00.255) 0:14:23.450 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:03:06 -0400 (0:00:00.317) 0:14:23.767 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable 
used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:03:07 -0400 (0:00:00.299) 0:14:24.066 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:03:07 -0400 (0:00:00.317) 0:14:24.384 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:03:07 -0400 (0:00:00.264) 0:14:24.648 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testbw1r7k_7lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:03:09 -0400 (0:00:01.416) 0:14:26.065 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:03:09 -0400 (0:00:00.591) 0:14:26.657 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:03:11 -0400 (0:00:01.860) 0:14:28.517 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:03:11 -0400 (0:00:00.439) 0:14:28.957 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:03:12 -0400 (0:00:00.749) 0:14:29.707 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:03:13 -0400 (0:00:00.783) 0:14:30.490 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:03:13 -0400 (0:00:00.471) 0:14:30.961 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:03:14 -0400 (0:00:00.769) 0:14:31.731 ****** skipping: 
[managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:03:15 -0400 (0:00:00.675) 0:14:32.407 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:03:16 -0400 (0:00:00.675) 0:14:33.082 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:03:16 -0400 (0:00:00.337) 0:14:33.420 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:03:16 -0400 (0:00:00.314) 0:14:33.734 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:03:16 -0400 (0:00:00.199) 0:14:33.934 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:03:17 -0400 (0:00:00.245) 0:14:34.179 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:03:18 -0400 (0:00:00.966) 0:14:35.146 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount 
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:03:18 -0400 (0:00:00.654) 0:14:35.801 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:03:19 -0400 (0:00:00.686) 0:14:36.488 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:03:20 -0400 (0:00:00.613) 0:14:37.101 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:03:20 -0400 (0:00:00.779) 0:14:37.880 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:03:21 -0400 (0:00:00.274) 0:14:38.155 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:03:21 -0400 (0:00:00.667) 0:14:38.822 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:03:22 -0400 (0:00:00.819) 0:14:39.641 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618941.459799, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618941.459799, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1229, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759618941.459799, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the
device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:03:23 -0400 (0:00:01.088) 0:14:40.729 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:03:23 -0400 (0:00:00.308) 0:14:41.038 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:03:24 -0400 (0:00:00.265) 0:14:41.303 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:03:24 -0400 (0:00:00.313) 0:14:41.617 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:03:24 -0400 (0:00:00.288) 0:14:41.905 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:03:25 -0400 (0:00:00.248) 0:14:42.154 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:03:25 -0400 (0:00:00.316) 0:14:42.471 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618941.7068012, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759618941.7068012, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1677, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759618941.7068012, "nlink": 1, "path": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:03:26 -0400 (0:00:01.329) 0:14:43.800 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:03:29 -0400 (0:00:02.394) 0:14:46.195 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.006408", "end": "2025-10-04 19:03:30.238310", "rc": 0, "start": "2025-10-04 19:03:30.231902" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 19bbcb22-6ec7-4d41-91bc-3354e1a43d49 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 689031 Threads: 2 Salt: ed db c5 bc 47 9d 7c f9 77 66 75 5b 53 12 d1 90 c7 47 3b 32 40 69 03 26 d1 0e 07 00 2d 94 32 6b AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131334 Salt: 55 c4 25 64 cc e9 22 7c df 3f ee cc 31 dc 70 b9 3a ff 1a 1b 8c d0 cf f7 77 7b 9f 9a de bd 86 a2 Digest: b0 d2 55 e0 1c 4e d4 1d 05 b0 50 be f4 2d be 01 cb ed 30 36 a8 5f 09 7e 42 ee 6a 21 c5 a8 91 c5 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:03:30 -0400 (0:00:01.343) 0:14:47.539 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:03:31 -0400 (0:00:00.729) 0:14:48.268 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:03:31 -0400 (0:00:00.679) 0:14:48.948 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:03:32 -0400 (0:00:00.361) 0:14:49.309 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:03:32 -0400 (0:00:00.426) 0:14:49.736 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
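
The luksDump output above is what the version assertion keys off: the header reports Version: 2, matching the requested encryption_luks_version of luks2, with a 512-bit aes-xts-plain64 key and the LUKS2-default argon2id PBKDF. A minimal sketch of this collect-and-assert pattern (the register name and the exact assertion are assumptions, not the verbatim test tasks):

    - name: Collect LUKS info for this volume
      ansible.builtin.command: cryptsetup luksDump /dev/sda1
      register: luks_dump          # variable name assumed for illustration
      changed_when: false          # read-only inspection

    - name: Check LUKS version
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')   # LUKS2 headers report version 2
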
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:03:33 -0400 (0:00:00.806) 0:14:50.542 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:03:33 -0400 (0:00:00.448) 0:14:50.991 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:03:34 -0400 (0:00:00.432) 0:14:51.424 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /dev/sda1 /tmp/storage_testbw1r7k_7lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testbw1r7k_7lukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:03:35 -0400 (0:00:00.824) 0:14:52.248 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:03:35 -0400 (0:00:00.698) 0:14:52.946 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:03:36 -0400 (0:00:00.781) 0:14:53.728 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:03:37 -0400 (0:00:00.783) 0:14:54.511 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:03:38 -0400 (0:00:00.702) 0:14:55.214 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
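
The crypttab tasks above split the expected entry on whitespace and assert each field independently: the mapping name (which must match the /dev/mapper node), the backing device, and the key file. Written out, the /etc/crypttab line under test is:

    luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /dev/sda1 /tmp/storage_testbw1r7k_7lukskey
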
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 19:03:38 -0400 (0:00:00.372) 0:14:55.586 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:03:38 -0400 (0:00:00.319) 0:14:55.906 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:03:39 -0400 (0:00:00.249) 0:14:56.156 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:03:39 -0400 (0:00:00.285) 0:14:56.441 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:03:39 -0400 (0:00:00.254) 0:14:56.696 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:03:39 -0400 (0:00:00.280) 0:14:56.977 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:03:40 -0400 (0:00:00.264) 0:14:57.241 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:03:40 -0400 (0:00:00.286) 0:14:57.528 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04
October 2025 19:03:40 -0400 (0:00:00.278) 0:14:57.806 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:03:41 -0400 (0:00:00.343) 0:14:58.150 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:03:41 -0400 (0:00:00.310) 0:14:58.461 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:03:41 -0400 (0:00:00.588) 0:14:59.049 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:03:42 -0400 (0:00:00.557) 0:14:59.606 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:03:43 -0400 (0:00:00.505) 0:15:00.112 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:03:43 -0400 (0:00:00.305) 0:15:00.417 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:03:44 -0400 (0:00:00.742) 0:15:01.160 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:03:44 -0400 (0:00:00.560) 0:15:01.721 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] 
***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:03:45 -0400 (0:00:00.697) 0:15:02.418 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:03:46 -0400 (0:00:00.643) 0:15:03.061 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:03:46 -0400 (0:00:00.657) 0:15:03.719 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:03:47 -0400 (0:00:00.667) 0:15:04.386 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:03:49 -0400 (0:00:01.878) 0:15:06.265 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:03:51 -0400 (0:00:02.639) 0:15:08.905 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:03:52 -0400 (0:00:00.666) 0:15:09.571 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:03:52 -0400 (0:00:00.466) 0:15:10.037 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 
04 October 2025 19:03:53 -0400 (0:00:00.709) 0:15:10.747 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:03:54 -0400 (0:00:00.659) 0:15:11.406 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:03:55 -0400 (0:00:00.759) 0:15:12.166 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:03:55 -0400 (0:00:00.711) 0:15:12.877 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:03:56 -0400 (0:00:00.629) 0:15:13.507 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:03:57 -0400 (0:00:00.565) 0:15:14.072 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:03:57 -0400 (0:00:00.629) 0:15:14.702 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:03:58 -0400 (0:00:00.677) 0:15:15.379 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:03:59 -0400 (0:00:00.692) 0:15:16.072 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:03:59 -0400 (0:00:00.669) 0:15:16.741 ****** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:04:00 -0400 (0:00:00.365) 0:15:17.107 ****** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:04:00 -0400 (0:00:00.311) 0:15:17.418 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:04:00 -0400 (0:00:00.605) 0:15:18.024 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:04:01 -0400 (0:00:00.282) 0:15:18.306 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:04:01 -0400 (0:00:00.257) 0:15:18.563 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:04:01 -0400 (0:00:00.310) 0:15:18.876 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:04:02 -0400 (0:00:00.312) 0:15:19.189 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was 
False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:04:02 -0400 (0:00:00.300) 0:15:19.489 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:04:02 -0400 (0:00:00.297) 0:15:19.787 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:04:02 -0400 (0:00:00.247) 0:15:20.034 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:04:03 -0400 (0:00:00.305) 0:15:20.339 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:04:03 -0400 (0:00:00.645) 0:15:20.984 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:358 Saturday 04 October 2025 19:04:04 -0400 (0:00:00.326) 0:15:21.311 ****** ok: [managed-node8] => { "changed": false, "path": "/tmp/storage_testbw1r7k_7lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:368 Saturday 04 October 2025 19:04:05 -0400 (0:00:01.337) 0:15:22.648 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 19:04:06 -0400 (0:00:00.480) 0:15:23.129 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 19:04:06 -0400 
(0:00:00.643) 0:15:23.772 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:04:07 -0400 (0:00:00.577) 0:15:24.350 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:04:07 -0400 (0:00:00.460) 0:15:24.810 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:04:08 -0400 (0:00:00.665) 0:15:25.475 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:04:09 -0400 (0:00:00.740) 0:15:26.216 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:04:09 -0400 (0:00:00.264) 0:15:26.480 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not 
__storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:04:09 -0400 (0:00:00.359) 0:15:26.840 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:04:10 -0400 (0:00:00.275) 0:15:27.116 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:04:10 -0400 (0:00:00.317) 0:15:27.433 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:04:11 -0400 (0:00:00.863) 0:15:28.296 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:04:13 -0400 (0:00:02.223) 0:15:30.522 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:04:14 -0400 (0:00:00.591) 0:15:31.113 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:04:14 -0400 (0:00:00.743) 0:15:31.857 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:04:17 -0400 (0:00:02.454) 0:15:34.311 ****** included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:04:17 -0400 (0:00:00.637) 0:15:34.949 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:04:18 -0400 (0:00:00.523) 0:15:35.472 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:04:19 -0400 (0:00:00.593) 0:15:36.066 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:04:19 -0400 (0:00:00.626) 0:15:36.692 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:04:22 -0400 (0:00:02.545) 0:15:39.238 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": 
"systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:04:25 -0400 (0:00:02.992) 0:15:42.230 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:04:26 -0400 (0:00:00.949) 0:15:43.180 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:04:26 -0400 (0:00:00.217) 0:15:43.398 ****** fatal: [managed-node8]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 19:04:29 -0400 (0:00:02.674) 0:15:46.072 ****** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 
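This failure is the test's expected outcome: the requested volume has encryption: true, but the module arguments above show both encryption_key and encryption_password as None, so blivet refuses to create the LUKS2 volume. A minimal sketch of role variables that would reproduce this error (structure taken from the Show storage_pools output earlier in this run; the surrounding play is assumed, not part of the test file):

    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            # neither encryption_password nor encryption_key is set, so the
            # role fails with: encrypted volume 'test1' missing key/password

Supplying either encryption_password or an encryption_key file (as the key-file tests earlier in this play do) clears the error.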
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Saturday 04 October 2025 19:04:29 -0400 (0:00:00.424) 0:15:46.496 ******
skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Saturday 04 October 2025 19:04:29 -0400 (0:00:00.237) 0:15:46.734 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Saturday 04 October 2025 19:04:29 -0400 (0:00:00.315) 0:15:47.050 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Saturday 04 October 2025 19:04:30 -0400 (0:00:00.498) 0:15:47.549 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }

TASK [Create an encrypted lvm volume w/ default fs] ****************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:387
Saturday 04 October 2025 19:04:30 -0400 (0:00:00.303) 0:15:47.853 ******
included: fedora.linux_system_roles.storage for managed-node8
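Relative to the failing sketch above, this run supplies the LUKS key material and cipher settings. A sketch of the volume entry for this invocation, with values copied from the Show storage_pools output below (the passphrase is a throwaway test value, not a recommendation):

    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_cipher: aes-xts-plain64
        encryption_key_size: 512    # bits; XTS mode splits this into two 256-bit keys
        encryption_password: yabbadabbadoo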
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:04:34 -0400 (0:00:00.920) 0:15:51.575 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:04:34 -0400 (0:00:00.398) 0:15:51.974 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:04:35 -0400 (0:00:00.346) 0:15:52.320 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:04:35 -0400 (0:00:00.391) 0:15:52.712 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:04:35 -0400 (0:00:00.309) 0:15:53.021 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:04:36 -0400 (0:00:00.853) 0:15:53.875 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev 
libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:04:39 -0400 (0:00:02.221) 0:15:56.097 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:04:39 -0400 (0:00:00.810) 0:15:56.908 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:04:40 -0400 (0:00:00.724) 0:15:57.632 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:04:43 -0400 (0:00:02.561) 0:16:00.194 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:04:43 -0400 (0:00:00.629) 0:16:00.823 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:04:44 -0400 (0:00:00.606) 0:16:01.429 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:04:44 -0400 (0:00:00.622) 0:16:02.052 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:04:45 -0400 (0:00:00.727) 0:16:02.779 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get 
service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:04:48 -0400 (0:00:02.404) 0:16:05.184 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", 
"status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:04:51 -0400 (0:00:03.015) 0:16:08.200 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:04:52 -0400 (0:00:00.872) 0:16:09.072 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:04:52 -0400 (0:00:00.257) 0:16:09.330 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:05:04 -0400 (0:00:12.684) 0:16:22.015 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:05:05 -0400 (0:00:00.565) 0:16:22.580 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618952.015891, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5d293bca6d53b9ccbba2d207416f2ac8ccfccf3f", "ctime": 1759618952.013891, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759618952.013891, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task 
path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 19:05:06 -0400 (0:00:01.296) 0:16:23.877 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:05:08 -0400 (0:00:01.335) 0:16:25.215 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:05:08 -0400 (0:00:00.298) 0:16:25.513 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:05:08 -0400 (0:00:00.454) 0:16:25.968 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:05:09 -0400 (0:00:00.416) 0:16:26.385 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Saturday 04 October 2025 19:05:09 -0400 (0:00:00.389) 0:16:26.774 ******
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node8] => (item={'src': '/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Saturday 04 October 2025 19:05:11 -0400 (0:00:01.790) 0:16:28.565 ******
ok: [managed-node8] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Saturday 04 October 2025 19:05:13 -0400 (0:00:01.918) 0:16:30.484 ******
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node8] => (item={'src': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" }

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Saturday 04 October 2025 19:05:15 -0400 (0:00:01.637) 0:16:32.122 ******
skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" }, "skip_reason": "Conditional result was False" }
skipping: [managed-node8] => { "changed": false }
MSG: All items skipped
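Mount bookkeeping happens in two passes over blivet_output.mounts: entries with state absent are removed first (the stale /etc/fstab line for the old LUKS mapping), then entries with state mounted are applied for the new device. Both go through ansible.posix.mount, as the redirect notices show. Roughly equivalent standalone tasks, a sketch only, with the device names taken from this run:

- name: Drop the stale fstab entry for the old mapping (sketch)
  ansible.posix.mount:
    path: /opt/test1
    src: /dev/mapper/luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49
    fstype: xfs
    state: absent

- name: Record and mount the new LUKS mapping (sketch)
  ansible.posix.mount:
    path: /opt/test1
    src: /dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b
    fstype: xfs
    opts: defaults
    state: mounted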
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Saturday 04 October 2025 19:05:15 -0400 (0:00:00.584) 0:16:32.706 ******
ok: [managed-node8] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Saturday 04 October 2025 19:05:17 -0400 (0:00:01.860) 0:16:34.566 ******
ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759618967.852029, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5aa0220c00dabf335f93280933b2c24f94e775ba", "ctime": 1759618958.300946, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 373293263, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759618958.3016922, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "1822865449", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Saturday 04 October 2025 19:05:18 -0400 (0:00:01.272) 0:16:35.839 ******
changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "present" } }
MSG: line added

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Saturday 04 October 2025 19:05:21 -0400 (0:00:02.649) 0:16:38.489 ******
ok: [managed-node8]

TASK [Verify role results - 7] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:406
Saturday 04 October 2025 19:05:23 -0400 (0:00:02.211) 0:16:40.700 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8

TASK [Print out pool information] **********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Saturday 04 October 2025 19:05:24 -0400 (0:00:00.549) 0:16:41.250 ******
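The crypttab pass above removed the old mapping's line and added one for the new device. /etc/crypttab entries have the form <name> <backing-device> <password> [options], and the '-' in the password field means the passphrase is supplied externally rather than read from a key file. The role performs this edit with its own handling; expressed as a plain task it would resemble the following sketch, which is illustrative only and not how the role actually writes the file:

- name: Add the crypttab entry for the new LUKS device (sketch)
  ansible.builtin.lineinfile:
    path: /etc/crypttab
    line: luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 -
    owner: root
    mode: "0600"  # matches the stat result for /etc/crypttab above

ok: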
[managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 19:05:25 -0400 (0:00:00.814) 0:16:42.064 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 19:05:25 -0400 (0:00:00.660) 0:16:42.725 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "5a76e9d8-506a-42b0-8014-dbfe2d14193b" }, "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "size": "4G", "type": "crypt", "uuid": "f80f4ca6-7053-4ecc-9803-8dddf7ef0849" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 19:05:26 -0400 (0:00:01.292) 0:16:44.018 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002670", "end": "2025-10-04 19:05:27.896032", "rc": 0, "start": "2025-10-04 19:05:27.893362" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Saturday 04 October 2025 19:05:28 -0400 (0:00:01.136) 0:16:45.155 ******
ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003626", "end": "2025-10-04 19:05:30.090969", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:05:29.087343" }
STDOUT:
luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Saturday 04 October 2025 19:05:30 -0400 (0:00:02.263) 0:16:47.418 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Saturday 04 October 2025 19:05:31 -0400 (0:00:00.939) 0:16:48.358 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Saturday 04 October 2025 19:05:31 -0400 (0:00:00.317) 0:16:48.676 ******
ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029332", "end": "2025-10-04 19:05:32.718245", "rc": 0, "start": "2025-10-04 19:05:32.688913" }
STDOUT:
0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Saturday 04 October 2025 19:05:32 -0400 (0:00:01.375) 0:16:50.051 ******
ok: [managed-node8] => { "changed": false }
MSG: All assertions passed

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Saturday 04 October 2025 19:05:33 -0400 (0:00:00.400) 0:16:50.452 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members)
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes)

TASK [Set test variables] ******************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Saturday 04 October 2025 19:05:34 -0400 (0:00:00.839) 0:16:51.291 ******
ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Saturday 04 October 2025 19:05:35 -0400 (0:00:00.799) 0:16:52.090 ******
ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Saturday 04 October 2025 19:05:38 -0400 (0:00:03.811) 0:16:55.902 ******
ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Saturday 04 October 2025 19:05:39 -0400 (0:00:00.660) 0:16:56.562 ******
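The "Get VG shared value status" task above shells out to vgs; the --binary flag makes the shared report field print 0 or 1 instead of a blank or "shared", and the assertion that follows expects 0 because the pool spec has shared: false. Roughly equivalent standalone tasks, a sketch in which the register name vg_shared is illustrative:

- name: Check whether the VG is marked shared (sketch of the command used above)
  ansible.builtin.command:
    argv: [vgs, --noheadings, --binary, -o, shared, foo]
  register: vg_shared  # hypothetical variable name
  changed_when: false

- name: Fail if the shared flag does not match the pool spec (illustrative)
  ansible.builtin.assert:
    that:
      - (vg_shared.stdout | trim) == '0'

ok: [managed-node8] => {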
"ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:05:40 -0400 (0:00:00.684) 0:16:57.247 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:05:40 -0400 (0:00:00.791) 0:16:58.038 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:05:41 -0400 (0:00:00.321) 0:16:58.360 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:05:42 -0400 (0:00:00.740) 0:16:59.100 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:05:42 -0400 (0:00:00.424) 0:16:59.525 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:05:43 -0400 (0:00:00.532) 0:17:00.058 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:82444): WARNING **: 19:05:43.959: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed.

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Saturday 04 October 2025 19:05:44 -0400 (0:00:01.340) 0:17:01.398 ******
skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
skipping: [managed-node8] => { "changed": false }
MSG: All items skipped

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Saturday 04 October 2025 19:05:45 -0400 (0:00:00.720) 0:17:02.119 ******
included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Saturday 04 October 2025 19:05:46 -0400 (0:00:01.760) 0:17:03.880 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Saturday 04 October 2025 19:05:47 -0400 (0:00:00.325) 0:17:04.206 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Saturday 04 October 2025 19:05:47 -0400 (0:00:00.292) 0:17:04.499 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Saturday 04 October 2025 19:05:47 -0400 (0:00:00.270) 0:17:04.769 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Saturday 04 October 2025 19:05:48 -0400 (0:00:00.319) 0:17:05.089 ******
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }
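Every check in verify-pool-md.yml is gated on the same guard, so for this non-RAID pool each task short-circuits to a skip, and the false_condition field in each result names the guard that failed. The pattern, reduced to a sketch in which the debug task is a stand-in for the real RAID queries:

- name: Gather RAID details only for RAID pools (sketch of the guard used above)
  ansible.builtin.debug:
    msg: "would inspect the RAID layout of {{ storage_test_pool.name }}"
  when: storage_test_pool.raid_level != none

TASK [Parse the chunk size] ****************************************************
task path: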
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:05:48 -0400 (0:00:00.333) 0:17:05.422 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:05:48 -0400 (0:00:00.267) 0:17:05.689 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:05:48 -0400 (0:00:00.300) 0:17:05.990 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:05:49 -0400 (0:00:00.281) 0:17:06.272 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:05:49 -0400 (0:00:00.258) 0:17:06.530 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:05:49 -0400 (0:00:00.289) 0:17:06.820 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:05:50 -0400 (0:00:00.330) 0:17:07.151 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:05:50 -0400 (0:00:00.676) 0:17:07.827 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 04 October 2025 19:05:51 -0400 (0:00:00.712) 0:17:08.540 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 04 October 2025 19:05:51 -0400 (0:00:00.342) 0:17:08.883 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 04 October 2025 19:05:52 -0400 (0:00:00.415) 0:17:09.299 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 04 October 2025 19:05:52 -0400 (0:00:00.368) 0:17:09.667 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 04 October 2025 19:05:53 -0400 (0:00:00.412) 0:17:10.079 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 04 October 2025 19:05:53 -0400 (0:00:00.352) 
0:17:10.432 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 04 October 2025 19:05:53 -0400 (0:00:00.304) 0:17:10.737 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:05:54 -0400 (0:00:00.374) 0:17:11.111 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:05:54 -0400 (0:00:00.714) 0:17:11.826 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 04 October 2025 19:05:55 -0400 (0:00:00.570) 0:17:12.396 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 04 October 2025 19:05:55 -0400 (0:00:00.320) 0:17:12.717 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check 
that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 04 October 2025 19:05:55 -0400 (0:00:00.281) 0:17:12.998 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 04 October 2025 19:05:56 -0400 (0:00:00.304) 0:17:13.302 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:05:56 -0400 (0:00:00.360) 0:17:13.663 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:05:57 -0400 (0:00:00.807) 0:17:14.474 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:05:58 -0400 (0:00:00.685) 0:17:15.159 ****** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:05:58 -0400 (0:00:00.385) 0:17:15.545 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 04 October 2025 19:05:59 -0400 (0:00:00.578) 0:17:16.124 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 04 October 2025 19:05:59 -0400 (0:00:00.729) 0:17:16.853 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 04 October 2025 19:06:00 -0400 (0:00:00.762) 0:17:17.616 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 04 October 2025 19:06:01 -0400 (0:00:00.476) 0:17:18.092 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 04 October 2025 19:06:01 -0400 (0:00:00.714) 0:17:18.807 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 04 October 2025 19:06:02 -0400 (0:00:00.698) 0:17:19.505 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:06:02 -0400 (0:00:00.280) 0:17:19.786 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:06:03 -0400 (0:00:00.309) 0:17:20.095 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:06:03 -0400 (0:00:00.874) 0:17:20.970 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': 
None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 04 October 2025 19:06:04 -0400 (0:00:00.701) 0:17:21.671 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 04 October 2025 19:06:04 -0400 (0:00:00.346) 0:17:22.017 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 04 October 2025 19:06:05 -0400 (0:00:00.316) 0:17:22.333 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 04 October 2025 19:06:05 -0400 (0:00:00.299) 0:17:22.633 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 04 October 2025 19:06:05 -0400 (0:00:00.261) 0:17:22.894 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 04 October 2025 19:06:06 -0400 (0:00:00.276) 0:17:23.170 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } 
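Every VDO check above skips because the volume under test sets both deduplication and compression to null, as the item dict logged with it shows. A minimal sketch of a volume definition that would exercise these assertions instead; the option names are taken from that item dict, the values and pool layout are illustrative only, and a real VDO volume may require further options such as vdo_pool_size:

    storage_pools:
      - name: foo
        type: lvm
        disks: [sda]
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            deduplication: true   # would make the "deduplication is on" assertions run
            compression: true     # likewise for the compression checks
            vdo_pool_size: 8g     # illustrative value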
TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 04 October 2025 19:06:06 -0400 (0:00:00.283) 0:17:23.454 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:06:06 -0400 (0:00:00.359) 0:17:23.813 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:06:07 -0400 (0:00:00.906) 0:17:24.719 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:06:07 -0400 (0:00:00.317) 0:17:25.037 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:06:08 -0400 (0:00:00.262) 0:17:25.300 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:06:08 -0400 (0:00:00.304) 0:17:25.604 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:06:08 -0400 (0:00:00.307) 0:17:25.912 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:06:09 -0400 (0:00:00.259) 0:17:26.171 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:06:09 -0400 (0:00:00.304) 0:17:26.475 
****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:06:09 -0400 (0:00:00.318) 0:17:26.794 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:06:10 -0400 (0:00:00.275) 0:17:27.069 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:06:10 -0400 (0:00:00.528) 0:17:27.598 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:06:11 -0400 (0:00:00.567) 0:17:28.165 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:06:14 -0400 (0:00:03.032) 0:17:31.197 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:06:14 -0400 (0:00:00.344) 0:17:31.541 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:06:15 -0400 (0:00:00.653) 0:17:32.195 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:06:15 -0400 (0:00:00.674) 0:17:32.870 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:06:16 -0400 (0:00:00.439) 0:17:33.310 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:06:16 -0400 (0:00:00.697) 0:17:34.007 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK 
[Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:06:17 -0400 (0:00:00.724) 0:17:34.728 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:06:18 -0400 (0:00:00.254) 0:17:35.452 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:06:18 -0400 (0:00:00.272) 0:17:35.707 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:06:18 -0400 (0:00:00.272) 0:17:35.980 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:06:19 -0400 (0:00:00.383) 0:17:36.383 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:06:19 -0400 (0:00:00.321) 0:17:36.705 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:06:20 -0400 (0:00:01.003) 0:17:37.708 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
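Read together, the three match lists set two tasks above imply that /etc/fstab carries an entry of roughly this shape (reconstructed from the logged matches; field spacing in the real file may differ, and the trailing "0 0" mirrors the volume's mount_check and mount_passno values of 0 in the item dict):

    /dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /opt/test1 xfs defaults 0 0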
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:06:21 -0400 (0:00:00.789) 0:17:38.497 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:06:22 -0400 (0:00:00.679) 0:17:39.177 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:06:22 -0400 (0:00:00.619) 0:17:39.796 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:06:23 -0400 (0:00:00.818) 0:17:40.615 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:06:23 -0400 (0:00:00.362) 0:17:40.977 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:06:24 -0400 (0:00:00.715) 0:17:41.692 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:06:25 -0400 (0:00:00.684) 0:17:42.377 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619104.3622193, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619104.3622193, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1914, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619104.3622193, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path:
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:06:26 -0400 (0:00:01.319) 0:17:43.696 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:06:27 -0400 (0:00:00.411) 0:17:44.108 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:06:27 -0400 (0:00:00.311) 0:17:44.419 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:06:27 -0400 (0:00:00.409) 0:17:44.828 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:06:28 -0400 (0:00:00.341) 0:17:45.170 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:06:28 -0400 (0:00:00.287) 0:17:45.457 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:06:28 -0400 (0:00:00.295) 0:17:45.753 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619104.6052215, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619104.6052215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619104.6052215, "nlink": 1, "path": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:06:29 -0400 (0:00:01.160) 0:17:46.913 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:06:32 -0400 (0:00:02.326) 0:17:49.240 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007062", "end": "2025-10-04 19:06:33.167162", "rc": 0, "start": "2025-10-04 19:06:33.160100" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 5a76e9d8-506a-42b0-8014-dbfe2d14193b Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 693496 Threads: 2 Salt: 9e 6c c5 2a 03 c8 fb b3 e9 95 4e 24 0f ab d2 00 9c 9e 2e a8 a7 3d aa d5 65 30 a7 78 2c 20 ba 25 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131730 Salt: ef b1 1f 1b 0e 50 55 86 54 55 e8 ba 55 ba d5 e9 34 8c 32 f5 66 6b 17 6d 40 c7 3d 3c c5 38 82 2a Digest: 2c b5 5b e0 2e 25 96 b0 92 2c fa 14 8d 29 d8 e8 bf 56 4f 8d a9 5f 7a a2 34 0e c2 ac 31 1e 6a db TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:06:33 -0400 (0:00:01.292) 0:17:50.532 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:06:34 -0400 (0:00:00.717) 0:17:51.249 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:06:34 -0400 (0:00:00.572) 0:17:51.821 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:06:35 -0400 (0:00:00.321) 0:17:52.143 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:06:35 -0400 (0:00:00.372) 0:17:52.515 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] 
***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:06:36 -0400 (0:00:01.019) 0:17:53.535 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:06:37 -0400 (0:00:00.948) 0:17:54.483 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:06:38 -0400 (0:00:00.791) 0:17:55.275 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:06:39 -0400 (0:00:00.862) 0:17:56.137 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:06:39 -0400 (0:00:00.593) 0:17:56.730 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:06:40 -0400 (0:00:00.750) 0:17:57.481 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:06:41 -0400 (0:00:00.808) 0:17:58.289 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:06:41 -0400 (0:00:00.554) 0:17:58.844 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
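This completes the encryption verification: the LUKS2 header reported by cryptsetup luksDump matches the requested settings, and /etc/crypttab holds the expected entry "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 -". A sketch of the role variables behind this volume, reconstructed from the item dict logged earlier (illustrative, not the test file verbatim; the password variable name is hypothetical since its value is no_log in the run):

    storage_pools:
      - name: foo
        type: lvm
        disks: [sda]                            # the PV verified earlier is /dev/sda
        volumes:
          - name: test1
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2      # luksDump: "Version: 2"
            encryption_cipher: aes-xts-plain64  # luksDump: "cipher: aes-xts-plain64"
            encryption_key_size: 512            # luksDump: "Key: 512 bits"
            encryption_password: "{{ storage_test_luks_password }}"  # hypothetical name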
was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:06:42 -0400 (0:00:00.220) 0:17:59.206 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:06:42 -0400 (0:00:00.234) 0:17:59.441 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:06:42 -0400 (0:00:00.312) 0:17:59.754 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:06:43 -0400 (0:00:00.388) 0:18:00.142 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:06:43 -0400 (0:00:00.355) 0:18:00.498 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:06:43 -0400 (0:00:00.219) 0:18:00.718 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:06:43 -0400 (0:00:00.274) 0:18:00.993 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:06:44 -0400 (0:00:00.224) 0:18:01.217 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:06:44 -0400 (0:00:00.208) 0:18:01.426 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:06:44 -0400 (0:00:00.341) 0:18:01.767 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:06:47 -0400 (0:00:02.944) 0:18:04.712 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:06:49 -0400 (0:00:01.448) 0:18:06.161 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:06:49 -0400 (0:00:00.665) 0:18:06.826 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:06:50 -0400 (0:00:00.281) 0:18:07.108 ****** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:06:51 -0400 (0:00:01.264) 0:18:08.373 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:06:51 -0400 (0:00:00.678) 0:18:09.051 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:06:52 -0400 (0:00:00.670) 0:18:09.722 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" }
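Both parse tasks above resolve to the same byte count: the requested "4g" is 4 * 1024^3 = 4294967296 bytes, which is exactly what the later "Assert expected size is actual size" task compares against the LV's actual size. The test parses sizes through blivet, so the following is only an equivalent illustration using Ansible's human_to_bytes filter:

    - name: Illustrate the '4g' to bytes conversion  # hypothetical task, not from the test
      ansible.builtin.debug:
        msg: "{{ '4g' | human_to_bytes }}"  # should print 4294967296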
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:06:53 -0400 (0:00:00.546) 0:18:10.269 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:06:53 -0400 (0:00:00.625) 0:18:10.894 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:06:54 -0400 (0:00:00.646) 0:18:11.541 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:06:55 -0400 (0:00:00.786) 0:18:12.327 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:06:56 -0400 (0:00:00.743) 0:18:13.071 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:06:56 -0400 (0:00:00.599) 0:18:13.670 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:06:57 -0400 (0:00:00.720) 0:18:14.391 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:06:57 -0400 (0:00:00.618) 0:18:15.010 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday
04 October 2025 19:06:58 -0400 (0:00:00.427) 0:18:15.437 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:06:59 -0400 (0:00:00.625) 0:18:16.063 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:06:59 -0400 (0:00:00.740) 0:18:16.803 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:07:00 -0400 (0:00:00.787) 0:18:17.590 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:07:01 -0400 (0:00:00.549) 0:18:18.140 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:07:01 -0400 (0:00:00.756) 0:18:18.897 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:07:02 -0400 (0:00:00.783) 0:18:19.680 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:07:03 -0400 (0:00:00.685) 0:18:20.366 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:07:04 -0400 (0:00:00.769) 0:18:21.135 ****** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task 
path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:07:05 -0400 (0:00:01.123) 0:18:22.259 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:07:05 -0400 (0:00:00.251) 0:18:22.510 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:07:06 -0400 (0:00:00.723) 0:18:23.234 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030167", "end": "2025-10-04 19:07:07.325190", "rc": 0, "start": "2025-10-04 19:07:07.295023" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:07:07 -0400 (0:00:01.450) 0:18:24.684 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:07:08 -0400 (0:00:00.695) 0:18:25.379 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:07:08 -0400 (0:00:00.664) 0:18:26.044 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:07:09 -0400 (0:00:00.772) 0:18:26.816 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:07:10 -0400 (0:00:00.628) 0:18:27.445 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 
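
Note on the size verification above: the requested volume size "4g" resolves to 4 GiB = 4 x 1024^3 = 4,294,967,296 bytes, which is why the expected size "4294967296" agrees with the actual size reported by both lvs ("4g") and parted ("4GiB"), and the assertion passes.
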
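Note on the cache checks above: the "Get information about the LV" and "Set LV segment type" steps can be reproduced outside the test suite with a minimal task pair along the following lines. This is a sketch only; the task names and the lv_info / lv_segtype variables are illustrative and not taken from the role.

- name: Query segment/cache attributes of foo/test1
  # --nameprefixes --unquoted emits LVM2_*=value pairs, matching the STDOUT shown above
  ansible.builtin.command:
    argv:
      - lvs
      - --noheadings
      - --nameprefixes
      - --units=b
      - --nosuffix
      - --unquoted
      - -o
      - name,attr,cache_total_blocks,chunk_size,segtype
      - foo/test1
  register: lv_info
  changed_when: false  # read-only query

- name: Extract the segment type from the LVM2_SEGTYPE=... field
  ansible.builtin.set_fact:
    lv_segtype: "{{ lv_info.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') | first }}"

For the uncached linear volume in this run, lv_segtype evaluates to "linear" and LVM2_CACHE_TOTAL_BLOCKS is empty, consistent with the "All assertions passed" result of the "Check segment type" task and with the remaining cache tasks being skipped because storage_test_volume.cached is false.
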
Saturday 04 October 2025 19:07:10 -0400 (0:00:00.610) 0:18:28.055 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:07:11 -0400 (0:00:00.585) 0:18:28.641 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:07:11 -0400 (0:00:00.289) 0:18:28.930 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:07:12 -0400 (0:00:00.626) 0:18:29.557 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:409 Saturday 04 October 2025 19:07:12 -0400 (0:00:00.413) 0:18:29.971 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:07:13 -0400 (0:00:00.976) 0:18:30.948 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:07:14 -0400 (0:00:00.519) 0:18:31.467 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:07:15 -0400 (0:00:00.658) 0:18:32.126 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 
'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:07:15 -0400 (0:00:00.797) 0:18:32.924 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:07:16 -0400 (0:00:00.379) 0:18:33.303 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:07:16 -0400 (0:00:00.428) 0:18:33.732 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:07:17 -0400 (0:00:00.356) 0:18:34.089 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:07:17 -0400 (0:00:00.346) 0:18:34.435 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:07:18 -0400 (0:00:00.803) 0:18:35.239 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK 
[fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:07:20 -0400 (0:00:02.560) 0:18:37.799 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:07:21 -0400 (0:00:00.718) 0:18:38.518 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:07:22 -0400 (0:00:00.688) 0:18:39.207 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:07:24 -0400 (0:00:02.663) 0:18:41.870 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:07:25 -0400 (0:00:00.605) 0:18:42.476 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:07:25 -0400 (0:00:00.557) 0:18:43.034 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:07:26 -0400 (0:00:00.656) 0:18:43.690 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:07:27 -0400 (0:00:00.779) 0:18:44.469 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:07:29 -0400 (0:00:02.499) 0:18:46.969 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { 
"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": 
"modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service": { "name": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:07:33 -0400 (0:00:03.106) 0:18:50.075 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:07:34 -0400 (0:00:00.994) 0:18:51.069 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d19bbcb22\x2d6ec7\x2d4d41\x2d91bc\x2d3354e1a43d49.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "name": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "tmp.mount \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target dev-sda1.device -.mount", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d19bbcb22\\\\x2d6ec7\\\\x2d4d41\\\\x2d91bc\\\\x2d3354e1a43d49.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /dev/sda1 /tmp/storage_testbw1r7k_7lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 /dev/sda1 /tmp/storage_testbw1r7k_7lukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-19bbcb22-6ec7-4d41-91bc-3354e1a43d49 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", 
"LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d19bbcb22\\\\x2d6ec7\\\\x2d4d41\\\\x2d91bc\\\\x2d3354e1a43d49.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_testbw1r7k_7lukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:05:17 EDT", "StateChangeTimestampMonotonic": "2666560183", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": 
"\"blockdev@dev-mapper-luks\\\\x2d19bbcb22\\\\x2d6ec7\\\\x2d4d41\\\\x2d91bc\\\\x2d3354e1a43d49.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:07:35 -0400 (0:00:01.660) 0:18:52.730 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:07:38 -0400 (0:00:02.674) 0:18:55.405 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:07:38 -0400 (0:00:00.602) 0:18:56.007 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619114.8183105, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "94299e3a6316e29f0fe5278f78e2734c255871dd", "ctime": 1759619114.8153105, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759619114.8153105, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 19:07:40 -0400 (0:00:01.279) 0:18:57.287 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:07:40 -0400 (0:00:00.322) 0:18:57.609 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d19bbcb22\x2d6ec7\x2d4d41\x2d91bc\x2d3354e1a43d49.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "name": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "DevicePolicy": 
"auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d19bbcb22\\x2d6ec7\\x2d4d41\\x2d91bc\\x2d3354e1a43d49.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d19bbcb22\\\\x2d6ec7\\\\x2d4d41\\\\x2d91bc\\\\x2d3354e1a43d49.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", 
"ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:07:42 -0400 (0:00:01.575) 0:18:59.185 ****** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:07:42 -0400 (0:00:00.430) 0:18:59.616 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:07:42 -0400 (0:00:00.354) 0:18:59.970 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 19:07:43 -0400 (0:00:00.312) 0:19:00.283 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 19:07:44 -0400 (0:00:00.779) 0:19:01.063 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 19:07:45 -0400 (0:00:01.917) 0:19:02.980 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node8] => (item={'src': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 19:07:47 -0400 (0:00:01.856) 0:19:04.837 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 19:07:48 -0400 (0:00:00.760) 0:19:05.597 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 19:07:50 -0400 (0:00:01.882) 0:19:07.480 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619129.088435, "attr_flags": 
"", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "41dd92bc6da332a68d463b27bdd167c271fc924b", "ctime": 1759619121.2163663, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 658505925, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759619121.2174883, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3090432077", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 19:07:51 -0400 (0:00:01.200) 0:19:08.681 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 19:07:51 -0400 (0:00:00.309) 0:19:08.990 ****** ok: [managed-node8] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:423 Saturday 04 October 2025 19:07:54 -0400 (0:00:02.175) 0:19:11.166 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:430 Saturday 04 October 2025 19:07:54 -0400 (0:00:00.383) 0:19:11.550 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 19:07:55 -0400 (0:00:00.694) 0:19:12.245 ****** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 19:07:55 -0400 (0:00:00.756) 0:19:13.001 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 19:07:56 -0400 (0:00:00.632) 0:19:13.634 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "5a76e9d8-506a-42b0-8014-dbfe2d14193b" }, "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "size": "4G", "type": "crypt", "uuid": "f80f4ca6-7053-4ecc-9803-8dddf7ef0849" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 19:07:57 -0400 (0:00:01.244) 0:19:14.879 ****** ok: 
[managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003098", "end": "2025-10-04 19:07:58.781654", "rc": 0, "start": "2025-10-04 19:07:58.778556" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 19:07:59 -0400 (0:00:01.194) 0:19:16.073 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003530", "end": "2025-10-04 19:08:00.061696", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:08:00.058166" } STDOUT: luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:08:00 -0400 (0:00:01.295) 0:19:17.369 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': 
[], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 19:08:03 -0400 (0:00:02.904) 0:19:20.273 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 19:08:03 -0400 (0:00:00.334) 0:19:20.608 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029124", "end": "2025-10-04 19:08:04.572134", "rc": 0, "start": "2025-10-04 19:08:04.543010" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 19:08:04 -0400 (0:00:01.285) 0:19:21.894 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 19:08:05 -0400 (0:00:00.433) 0:19:22.328 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 19:08:05 -0400 (0:00:00.703) 0:19:23.031 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 19:08:06 -0400 (0:00:00.763) 0:19:23.795 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 19:08:08 -0400 (0:00:01.319) 0:19:25.114 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 04 October 2025 19:08:08 -0400 (0:00:00.676) 0:19:25.791 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:08:09 -0400 (0:00:00.703) 0:19:26.494 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:08:10 -0400 (0:00:00.666) 0:19:27.161 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:08:10 -0400 (0:00:00.473) 0:19:27.635 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:08:11 -0400 (0:00:00.731) 0:19:28.366 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:08:11 -0400 (0:00:00.430) 0:19:28.797 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:08:12 -0400 (0:00:00.502) 0:19:29.299 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:86876): WARNING **: 19:08:13.138: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.173 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6fa89a8227'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.44.173 closed.
TASK [Verify that PVs fill the whole device when they should] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 19:08:13 -0400 (0:00:01.257) 0:19:30.557 ****** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 19:08:14 -0400 (0:00:00.604) 0:19:31.162 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 19:08:14 -0400 (0:00:00.773) 0:19:31.936 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 19:08:15 -0400 (0:00:00.292) 0:19:32.229 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 19:08:15 -0400 (0:00:00.331) 0:19:32.560 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 19:08:15 -0400 (0:00:00.289) 0:19:32.849 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md 
chunk size regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 19:08:16 -0400 (0:00:00.312) 0:19:33.162 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:08:16 -0400 (0:00:00.280) 0:19:33.443 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:08:16 -0400 (0:00:00.293) 0:19:33.736 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:08:16 -0400 (0:00:00.288) 0:19:34.025 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:08:17 -0400 (0:00:00.361) 0:19:34.386 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:08:17 -0400 (0:00:00.325) 0:19:34.711 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:08:17 -0400 (0:00:00.306) 0:19:35.018 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:08:18 -0400 (0:00:00.374) 0:19:35.392 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:08:19 -0400 (0:00:00.734) 0:19:36.126 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 04 October 2025 19:08:19 -0400 (0:00:00.591) 0:19:36.718 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 04 October 2025 19:08:19 -0400 (0:00:00.310) 0:19:37.029 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 04 October 2025 19:08:20 -0400 (0:00:00.267) 0:19:37.296 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 04 October 2025 19:08:20 -0400 (0:00:00.243) 0:19:37.540 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 04 October 2025 19:08:20 -0400 
(0:00:00.451) 0:19:37.991 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 04 October 2025 19:08:21 -0400 (0:00:00.490) 0:19:38.482 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 04 October 2025 19:08:21 -0400 (0:00:00.418) 0:19:38.900 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:08:22 -0400 (0:00:00.299) 0:19:39.199 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:08:22 -0400 (0:00:00.733) 0:19:39.933 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 04 October 2025 19:08:23 -0400 (0:00:00.620) 0:19:40.554 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that 
volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 04 October 2025 19:08:23 -0400 (0:00:00.297) 0:19:40.851 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 04 October 2025 19:08:24 -0400 (0:00:00.322) 0:19:41.174 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 04 October 2025 19:08:24 -0400 (0:00:00.281) 0:19:41.456 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:08:24 -0400 (0:00:00.402) 0:19:41.858 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:08:25 -0400 (0:00:00.781) 0:19:42.639 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:08:26 -0400 (0:00:00.641) 0:19:43.280 ****** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:08:26 -0400 (0:00:00.292) 0:19:43.573 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 04 October 2025 19:08:27 -0400 (0:00:00.630) 0:19:44.203 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, 
"changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 04 October 2025 19:08:27 -0400 (0:00:00.722) 0:19:44.925 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 04 October 2025 19:08:28 -0400 (0:00:00.644) 0:19:45.570 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 04 October 2025 19:08:29 -0400 (0:00:00.591) 0:19:46.161 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 04 October 2025 19:08:29 -0400 (0:00:00.584) 0:19:46.746 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 04 October 2025 19:08:30 -0400 (0:00:00.712) 0:19:47.459 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:08:30 -0400 (0:00:00.350) 0:19:47.809 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:08:31 -0400 (0:00:00.287) 0:19:48.096 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:08:31 -0400 (0:00:00.748) 0:19:48.859 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 04 October 2025 19:08:32 -0400 (0:00:00.719) 0:19:49.579 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 04 October 2025 19:08:32 -0400 (0:00:00.224) 0:19:49.803 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 04 October 2025 19:08:33 -0400 (0:00:00.297) 0:19:50.101 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 04 October 2025 19:08:33 -0400 (0:00:00.263) 0:19:50.364 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 04 October 2025 19:08:33 -0400 (0:00:00.349) 0:19:50.714 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 04 October 2025 19:08:35 -0400 (0:00:01.397) 0:19:52.111 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 04 October 2025 19:08:35 -0400 (0:00:00.306) 0:19:52.418 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:08:35 -0400 (0:00:00.298) 0:19:52.716 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:08:36 -0400 (0:00:01.025) 0:19:53.742 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:08:36 -0400 (0:00:00.306) 0:19:54.049 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:08:37 -0400 (0:00:00.294) 0:19:54.343 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:08:37 -0400 (0:00:00.284) 0:19:54.627 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:08:37 -0400 (0:00:00.330) 0:19:54.957 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:08:38 -0400 (0:00:00.252) 0:19:55.210 ****** skipping: [managed-node8] 
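
Note: the VDO block above and the Stratis block that follows share one pattern: every check is gated by a `when:` on the volume or pool type, so for this plain LVM-on-LUKS pool they all record skips rather than running. A minimal sketch of the gate, with the condition taken verbatim from the log; the CLI call and register name are assumptions:

- name: Get stratis pool information (type-gate sketch)
  ansible.builtin.command: stratis report   # assumed command; the gate skips it here
  register: storage_test_stratis_report
  changed_when: false
  when: storage_test_pool.type == 'stratis'
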
=> { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:08:38 -0400 (0:00:00.324) 0:19:55.534 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:08:38 -0400 (0:00:00.359) 0:19:55.893 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:08:39 -0400 (0:00:00.295) 0:19:56.189 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:08:39 -0400 (0:00:00.553) 0:19:56.755 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:08:40 -0400 (0:00:00.721) 0:19:57.476 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: 
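
The `_storage_volume_tests` list set above drives the per-volume verification matrix: one include per subsystem, in order. Judging by the task title and the includes that follow, the driver presumably looks something like this sketch:

- name: Run test verify for storage_test_volume_subset (sketch)
  ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset
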
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:08:42 -0400 (0:00:01.862) 0:19:59.339 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:08:42 -0400 (0:00:00.417) 0:19:59.757 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:08:43 -0400 (0:00:00.733) 0:20:00.490 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:08:44 -0400 (0:00:00.776) 0:20:01.267 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:08:44 -0400 (0:00:00.391) 0:20:01.658 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
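
The mount-state check above passes because the LUKS mapper device, not the raw LV, is what appears in the mount table. A sketch of such an assertion, assuming gathered setup facts; the real task may differ:

- name: Verify the current mount state by device (sketch)
  ansible.builtin.assert:
    that:
      - >-
        ansible_facts.mounts
        | selectattr('device', 'equalto', storage_test_device_path)
        | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
        | list | length == 1
    msg: >-
      {{ storage_test_device_path }} is not mounted at
      {{ storage_test_mount_expected_mount_point }}
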
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:08:45 -0400 (0:00:00.783) 0:20:02.442 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:08:46 -0400 (0:00:00.766) 0:20:03.208 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:08:46 -0400 (0:00:00.842) 0:20:04.051 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:08:47 -0400 (0:00:00.349) 0:20:04.400 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:08:47 -0400 (0:00:00.306) 0:20:04.706 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:08:47 -0400 (0:00:00.258) 0:20:04.965 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:08:48 -0400 (0:00:00.302) 0:20:05.268 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
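
The fstab facts just set hold exactly one match each: the /etc/fstab line for this volume is keyed by the LUKS mapper path (note the trailing space preserved by the match). A sketch of how such match lists could be produced, assuming `storage_test_fstab` registers the file content; the exact expressions are assumptions:

- name: Set some variables for fstab checking (sketch)
  ansible.builtin.set_fact:
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout
      | regex_findall('^' ~ storage_test_volume._mount_id ~ ' ', multiline=True) }}"
    storage_test_fstab_mount_point_matches: "{{ storage_test_fstab.stdout
      | regex_findall(' ' ~ storage_test_volume.mount_point ~ ' ') }}"
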
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:08:49 -0400 (0:00:00.960) 0:20:06.228 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:08:49 -0400 (0:00:00.723) 0:20:06.952 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:08:50 -0400 (0:00:00.861) 0:20:07.814 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:08:51 -0400 (0:00:00.603) 0:20:08.417 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:08:52 -0400 (0:00:00.658) 0:20:09.076 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:08:52 -0400 (0:00:00.370) 0:20:09.446 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:08:53 -0400 (0:00:00.774) 0:20:10.220 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:08:53 -0400 (0:00:00.812) 0:20:11.033 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619193.1659932, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619104.3622193, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1914, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619104.3622193, "nlink": 1, 
"path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:08:55 -0400 (0:00:01.454) 0:20:12.487 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:08:55 -0400 (0:00:00.398) 0:20:12.885 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:08:56 -0400 (0:00:00.279) 0:20:13.165 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:08:56 -0400 (0:00:00.346) 0:20:13.511 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:08:56 -0400 (0:00:00.410) 0:20:13.922 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:08:57 -0400 (0:00:00.313) 0:20:14.235 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:08:57 -0400 (0:00:00.435) 0:20:14.671 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619258.0295582, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619104.6052215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619104.6052215, "nlink": 1, "path": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": 
true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:08:58 -0400 (0:00:01.256) 0:20:15.927 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:09:01 -0400 (0:00:02.271) 0:20:18.199 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007045", "end": "2025-10-04 19:09:02.159425", "rc": 0, "start": "2025-10-04 19:09:02.152380" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 5a76e9d8-506a-42b0-8014-dbfe2d14193b Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 693496 Threads: 2 Salt: 9e 6c c5 2a 03 c8 fb b3 e9 95 4e 24 0f ab d2 00 9c 9e 2e a8 a7 3d aa d5 65 30 a7 78 2c 20 ba 25 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131730 Salt: ef b1 1f 1b 0e 50 55 86 54 55 e8 ba 55 ba d5 e9 34 8c 32 f5 66 6b 17 6d 40 c7 3d 3c c5 38 82 2a Digest: 2c b5 5b e0 2e 25 96 b0 92 2c fa 14 8d 29 d8 e8 bf 56 4f 8d a9 5f 7a a2 34 0e c2 ac 31 1e 6a db TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:09:02 -0400 (0:00:01.272) 0:20:19.471 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:09:03 -0400 (0:00:00.591) 0:20:20.062 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:09:03 -0400 (0:00:00.743) 0:20:20.805 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:09:04 -0400 (0:00:00.552) 0:20:21.358 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:09:04 -0400 (0:00:00.376) 0:20:21.734 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:09:05 -0400 (0:00:00.823) 0:20:22.558 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:09:05 -0400 (0:00:00.421) 0:20:22.979 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:09:06 -0400 (0:00:00.336) 0:20:23.315 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:09:07 -0400 (0:00:00.844) 0:20:24.160 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:09:07 -0400 (0:00:00.880) 0:20:25.040 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:09:08 -0400 (0:00:00.712) 0:20:25.753 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:09:09 -0400 (0:00:00.799) 0:20:26.552 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:09:10 -0400 (0:00:00.756) 0:20:27.309 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
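
The crypttab facts set above encode the contract for one entry: `<name> <backing device> <key file>`, where `-` means no key file (passphrase prompt at activation). The format check that just passed presumably asserts along these lines; the field-splitting is an assumption:

- name: Validate the format of the crypttab entry (sketch)
  ansible.builtin.assert:
    that:
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
      - _storage_test_crypttab_entries[0].split() | length == 3
      - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file
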
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 19:09:10 -0400 (0:00:00.421) 0:20:27.730 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:09:10 -0400 (0:00:00.291) 0:20:28.022 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:09:11 -0400 (0:00:00.330) 0:20:28.353 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:09:11 -0400 (0:00:00.289) 0:20:28.642 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:09:11 -0400 (0:00:00.289) 0:20:28.931 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:09:12 -0400 (0:00:00.347) 0:20:29.279 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:09:12 -0400 (0:00:00.277) 0:20:29.557 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:09:12 -0400 (0:00:00.240) 0:20:29.797 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] 
********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:09:13 -0400 (0:00:00.274) 0:20:30.072 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:09:13 -0400 (0:00:00.298) 0:20:30.371 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:09:13 -0400 (0:00:00.302) 0:20:30.673 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:09:15 -0400 (0:00:01.699) 0:20:32.372 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:09:16 -0400 (0:00:01.580) 0:20:33.953 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:09:17 -0400 (0:00:00.798) 0:20:34.751 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:09:18 -0400 (0:00:00.330) 0:20:35.082 ****** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:09:19 -0400 (0:00:01.631) 0:20:36.714 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:09:20 -0400 (0:00:00.724) 0:20:37.439 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** 
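
The parsed sizes above agree because LVM's `4g` shorthand is binary: 4 x 1024^3 = 4294967296 bytes. The same conversion can be reproduced with Ansible's stock `human_to_bytes` filter, as in this sketch (the fact name is an assumption):

- name: Parse the requested size of the volume (sketch)
  ansible.builtin.set_fact:
    # '4g' | human_to_bytes == 4294967296, matching the log output
    storage_test_requested_bytes: "{{ storage_test_volume.size | human_to_bytes }}"

- name: Assert expected size is actual size (sketch)
  ansible.builtin.assert:
    that:
      - storage_test_actual_size.bytes | int == storage_test_requested_bytes | int
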
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:09:20 -0400 (0:00:00.587) 0:20:38.026 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:09:21 -0400 (0:00:00.701) 0:20:38.728 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:09:22 -0400 (0:00:00.701) 0:20:39.429 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:09:23 -0400 (0:00:00.761) 0:20:40.190 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:09:23 -0400 (0:00:00.610) 0:20:40.801 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:09:24 -0400 (0:00:00.712) 0:20:41.513 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:09:25 -0400 (0:00:00.601) 0:20:42.114 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:09:25 -0400 (0:00:00.661) 0:20:42.776 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:09:26 -0400 (0:00:00.759) 
0:20:43.535 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:09:27 -0400 (0:00:00.672) 0:20:44.208 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:09:27 -0400 (0:00:00.675) 0:20:44.883 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:09:28 -0400 (0:00:00.655) 0:20:45.539 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:09:29 -0400 (0:00:00.647) 0:20:46.186 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:09:29 -0400 (0:00:00.712) 0:20:46.899 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:09:30 -0400 (0:00:00.791) 0:20:47.690 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:09:33 -0400 (0:00:02.834) 0:20:50.524 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:09:34 -0400 (0:00:00.630) 0:20:51.154 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
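
All of the percentage and thin-pool branches above are skipped because the volume size is an absolute `4g`. When a size such as `60%` is requested instead, the expected size has to be derived from the parent pool, roughly as in this sketch; the pool-size variable is hypothetical, and only the `when:` condition is taken from the log:

- name: Calculate the expected size based on pool size and percentage value (sketch)
  ansible.builtin.set_fact:
    storage_test_expected_size: "{{ ((_pool_size_bytes | int)
      * (storage_test_volume.size | string | replace('%', '') | float) / 100) | int }}"
  when: "'%' in storage_test_volume.size | string"
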
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:09:34 -0400 (0:00:00.727) 0:20:51.882 ****** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:09:35 -0400 (0:00:00.430) 0:20:52.312 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:09:35 -0400 (0:00:00.394) 0:20:52.707 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:09:36 -0400 (0:00:00.839) 0:20:53.546 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.033703", "end": "2025-10-04 19:09:37.703746", "rc": 0, "start": "2025-10-04 19:09:37.670043" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:09:38 -0400 (0:00:01.509) 0:20:55.056 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:09:38 -0400 (0:00:00.600) 0:20:55.656 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:09:39 -0400 (0:00:00.697) 0:20:56.354 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:09:39 -0400 (0:00:00.625) 0:20:56.980 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 
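
The `lvs` call above is the whole cache check: `LVM2_SEGTYPE=linear` shows no cache layer was stacked onto the LV. A sketch of the parsing and the assertion, with register names assumed and the cache/linear dichotomy covering only this test's two cases:

- name: Get information about the LV (as run above)
  ansible.builtin.command: >-
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: _lvs_out   # assumed register name
  changed_when: false

- name: Set LV segment type (parsing sketch)
  ansible.builtin.set_fact:
    storage_test_lv_segtype: '{{ _lvs_out.stdout | regex_findall("LVM2_SEGTYPE=(\S+)") }}'

- name: Check segment type (sketch)
  ansible.builtin.assert:
    that:
      - storage_test_lv_segtype[0] == ('cache' if storage_test_volume.cached | bool else 'linear')
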
Saturday 04 October 2025 19:09:40 -0400 (0:00:00.682) 0:20:57.663 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:09:41 -0400 (0:00:00.696) 0:20:58.359 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:09:41 -0400 (0:00:00.645) 0:20:59.005 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:09:42 -0400 (0:00:00.279) 0:20:59.284 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:09:42 -0400 (0:00:00.480) 0:20:59.764 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 19:09:43 -0400 (0:00:00.319) 0:21:00.084 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:436 Saturday 04 October 2025 19:09:44 -0400 (0:00:01.297) 0:21:01.381 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 19:09:45 -0400 (0:00:00.737) 0:21:02.119 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 19:09:45 -0400 (0:00:00.660) 0:21:02.779 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific 
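
From here the test switches into its negative case: a file now exists on the mounted volume, and the role is re-invoked with `encryption: false` against the LUKS-backed device while `storage_safe_mode` is true, so the run must fail rather than silently reformat. The harness in verify-role-failed.yml presumably follows a block/rescue shape like this sketch; the matched error text is an assumption, and the pool definition is the one echoed a little further below:

- name: Verify role raises correct error (sketch)
  block:
    - name: Re-run the storage role under safe mode
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            disks: [sda]
            type: lvm
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: false   # would strip LUKS and destroy /opt/test1/quux
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo

    - name: Unreachable without a role failure
      ansible.builtin.fail:
        msg: Role did not fail as expected

  rescue:
    - name: Check the reported error (matched text is an assumption)
      ansible.builtin.assert:
        that:
          - ansible_failed_result.msg is search('safe mode')
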
variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:09:46 -0400 (0:00:00.647) 0:21:03.427 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:09:46 -0400 (0:00:00.548) 0:21:03.975 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:09:47 -0400 (0:00:00.620) 0:21:04.596 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:09:48 -0400 (0:00:00.756) 0:21:05.353 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:09:48 -0400 (0:00:00.312) 0:21:05.665 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] 
*** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:09:48 -0400 (0:00:00.286) 0:21:05.952 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:09:49 -0400 (0:00:00.353) 0:21:06.305 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:09:49 -0400 (0:00:00.344) 0:21:06.650 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:09:50 -0400 (0:00:00.780) 0:21:07.431 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:09:52 -0400 (0:00:02.449) 0:21:09.880 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:09:53 -0400 (0:00:00.725) 0:21:10.605 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:09:54 -0400 (0:00:00.819) 0:21:11.425 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:09:57 -0400 (0:00:02.886) 0:21:14.312 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
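
On the variables side of this re-run: RedHat.yml and CentOS.yml were skipped because no such files exist in the role's vars directory, while CentOS_9.yml is processed twice, presumably because two version-specific candidates in the lookup list resolve to the same file name, which is why the identical block is printed twice above. Reconstructed from that output, the vars file amounts to:

# roles/storage/vars/CentOS_9.yml, as echoed in the log; the last entry
# picks the s390x-specific package at runtime.
blivet_package_list:
  - python3-blivet
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - vdo
  - kmod-kvdo
  - xfsprogs
  - stratisd
  - stratis-cli
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
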
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:09:57 -0400 (0:00:00.502) 0:21:14.814 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:09:58 -0400 (0:00:00.639) 0:21:15.454 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:09:59 -0400 (0:00:00.632) 0:21:16.087 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:09:59 -0400 (0:00:00.527) 0:21:16.614 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:10:02 -0400 (0:00:02.641) 0:21:19.255 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { 
"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" 
}, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service": { "name": 
"systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": 
"systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:10:05 -0400 (0:00:03.457) 0:21:22.713 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:10:06 -0400 (0:00:00.824) 0:21:23.537 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", 
"status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\" cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", 
"FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target 
\"dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:07:41 EDT", "StateChangeTimestampMonotonic": "2811220606", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:10:08 -0400 (0:00:01.787) 0:21:25.325 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 19:10:11 -0400 (0:00:03.026) 0:21:28.351 ****** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:10:11 -0400 (0:00:00.549) 0:21:28.900 ****** changed: [managed-node8] => 
(item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:07:41 EDT", "StateChangeTimestampMonotonic": "2811220606", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 19:10:13 -0400 (0:00:01.637) 0:21:30.538 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 19:10:13 -0400 (0:00:00.357) 0:21:30.896 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 19:10:14 -0400 (0:00:00.522) 0:21:31.419 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 04 October 2025 19:10:14 -0400 (0:00:00.389) 0:21:31.809 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619384.0666564, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759619384.0666564, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759619384.0666564, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2759255684", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 04 October 2025 19:10:16 -0400 (0:00:01.350) 0:21:33.159 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:460 Saturday 04 October 2025 19:10:16 -0400 (0:00:00.411) 0:21:33.570 ****** included: fedora.linux_system_roles.storage for 
managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:10:17 -0400 (0:00:00.800) 0:21:34.371 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:10:17 -0400 (0:00:00.429) 0:21:34.800 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:10:18 -0400 (0:00:00.674) 0:21:35.474 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:10:19 -0400 (0:00:00.825) 0:21:36.300 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:10:19 -0400 (0:00:00.324) 0:21:36.624 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK 
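The skipped RedHat.yml/CentOS.yml items and the loaded CentOS_9.yml above show the role layering distribution-specific vars files, most specific last, keeping only the files that actually exist. The general shape of such a task is roughly this sketch; the file list and variable names are illustrative, and the 'is file' guard mirrors the false_condition shown in the log.

# Sketch: load OS-specific vars, skipping files that do not exist.
- name: Set platform/version specific variables
  ansible.builtin.include_vars: "{{ __vars_file }}"
  loop:
    - RedHat.yml
    - CentOS.yml
    - CentOS_9.yml
  vars:
    __vars_file: "{{ role_path }}/vars/{{ item }}"
  when: __vars_file is file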
[fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:10:19 -0400 (0:00:00.353) 0:21:36.978 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:10:20 -0400 (0:00:00.332) 0:21:37.310 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:10:20 -0400 (0:00:00.389) 0:21:37.700 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:10:21 -0400 (0:00:00.775) 0:21:38.476 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:10:24 -0400 (0:00:02.745) 0:21:41.221 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:10:26 -0400 (0:00:02.004) 0:21:43.225 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:10:29 -0400 (0:00:03.141) 0:21:46.366 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:10:31 -0400 (0:00:02.606) 0:21:48.973 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK 
[fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:10:32 -0400 (0:00:00.758) 0:21:49.732 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:10:33 -0400 (0:00:00.540) 0:21:50.272 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:10:33 -0400 (0:00:00.781) 0:21:51.053 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:10:34 -0400 (0:00:00.564) 0:21:51.618 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:10:37 -0400 (0:00:02.497) 0:21:54.115 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": 
"cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service": { "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": 
"systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:10:41 -0400 (0:00:04.528) 0:21:58.644 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:10:42 -0400 (0:00:00.923) 0:21:59.567 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target \"dev-mapper-foo\\\\x2dtest1.device\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach 
luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:07:41 EDT", "StateChangeTimestampMonotonic": "2811220606", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:10:44 -0400 (0:00:01.693) 0:22:01.260 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:10:47 -0400 (0:00:03.730) 0:22:04.991 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:10:48 -0400 (0:00:00.647) 0:22:05.639 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619114.8183105, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "94299e3a6316e29f0fe5278f78e2734c255871dd", "ctime": 1759619114.8153105, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759619114.8153105, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 
19:10:49 -0400 (0:00:01.133) 0:22:06.772 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:10:51 -0400 (0:00:01.366) 0:22:08.139 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", 
"StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:07:41 EDT", "StateChangeTimestampMonotonic": "2811220606", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:10:52 -0400 (0:00:01.796) 0:22:09.936 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": 
"", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:10:53 -0400 (0:00:00.361) 0:22:10.297 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:10:53 -0400 (0:00:00.369) 0:22:10.667 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 19:10:53 -0400 (0:00:00.355) 0:22:11.023 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'path': 
'/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 19:10:55 -0400 (0:00:01.886) 0:22:12.909 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 19:10:57 -0400 (0:00:02.009) 0:22:14.919 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 19:10:59 -0400 (0:00:01.929) 0:22:16.848 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 19:11:00 -0400 (0:00:00.708) 0:22:17.556 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 19:11:02 -0400 (0:00:02.058) 0:22:19.615 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 
1759619129.088435, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "41dd92bc6da332a68d463b27bdd167c271fc924b", "ctime": 1759619121.2163663, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 658505925, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759619121.2174883, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3090432077", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 19:11:03 -0400 (0:00:01.260) 0:22:20.876 ****** changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 19:11:05 -0400 (0:00:01.942) 0:22:22.819 ****** ok: [managed-node8] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:477 Saturday 04 October 2025 19:11:08 -0400 (0:00:02.477) 0:22:25.297 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 19:11:09 -0400 (0:00:00.792) 0:22:26.089 ****** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 19:11:09 -0400 (0:00:00.615) 0:22:26.705 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 19:11:10 -0400 (0:00:00.671) 0:22:27.377 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f6b0fd87-8ff1-45eb-8974-705a8be0907d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 19:11:11 -0400 (0:00:01.298) 0:22:28.675 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004638", "end": "2025-10-04 19:11:13.574474", "rc": 0, "start": "2025-10-04 19:11:12.569836" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on 
Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 19:11:13 -0400 (0:00:02.269) 0:22:30.945 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003470", "end": "2025-10-04 19:11:14.916849", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:11:14.913379" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:11:15 -0400 (0:00:01.295) 0:22:32.241 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 19:11:16 -0400 (0:00:01.067) 0:22:33.308 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 19:11:16 -0400 (0:00:00.242) 0:22:33.551 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.033952", "end": "2025-10-04 19:11:17.572882", "rc": 0, "start": "2025-10-04 19:11:17.538930" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 19:11:17 -0400 (0:00:01.392) 0:22:34.943 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 19:11:18 -0400 (0:00:00.405) 0:22:35.349 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 19:11:19 -0400 (0:00:00.862) 0:22:36.211 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 19:11:19 -0400 (0:00:00.789) 0:22:37.001 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 19:11:21 -0400 (0:00:01.347) 0:22:38.349 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 04 
October 2025 19:11:21 -0400 (0:00:00.641) 0:22:38.990 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:11:22 -0400 (0:00:00.842) 0:22:39.833 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:11:23 -0400 (0:00:00.811) 0:22:40.645 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:11:23 -0400 (0:00:00.341) 0:22:40.987 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:11:24 -0400 (0:00:00.632) 0:22:41.620 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:11:24 -0400 (0:00:00.366) 0:22:41.986 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:11:25 -0400 (0:00:00.391) 0:22:42.377 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:93414): WARNING **: 19:11:26.365: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 19:11:26 -0400 (0:00:01.411) 0:22:43.789 ****** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 19:11:27 -0400 (0:00:00.608) 0:22:44.398 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 19:11:27 -0400 (0:00:00.641) 0:22:45.048 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 19:11:28 -0400 (0:00:00.234) 0:22:45.283 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 19:11:28 -0400 (0:00:00.308) 0:22:45.591 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 19:11:28 -0400 (0:00:00.369) 0:22:45.960 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 19:11:29 -0400 (0:00:00.298) 0:22:46.259 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:11:29 -0400 (0:00:00.253) 0:22:46.513 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:11:29 -0400 (0:00:00.276) 0:22:46.790 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:11:30 -0400 (0:00:00.344) 0:22:47.134 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:11:30 -0400 (0:00:00.250) 0:22:47.385 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:11:30 -0400 (0:00:00.314) 0:22:47.699 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:11:30 -0400 (0:00:00.280) 0:22:47.980 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:11:31 -0400 (0:00:00.312) 0:22:48.292 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:11:31 -0400 (0:00:00.633) 0:22:48.926 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 04 October 2025 19:11:33 -0400 (0:00:01.738) 0:22:50.665 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 04 October 2025 19:11:34 -0400 (0:00:00.456) 0:22:51.121 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 04 October 2025 19:11:34 -0400 (0:00:00.435) 0:22:51.557 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 04 October 2025 19:11:34 -0400 (0:00:00.380) 0:22:51.937 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 04 October 2025 19:11:35 -0400 (0:00:00.373) 0:22:52.310 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 04 October 2025 19:11:35 -0400 (0:00:00.449) 0:22:52.760 ****** skipping: [managed-node8] => { "changed": 
false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 04 October 2025 19:11:36 -0400 (0:00:00.514) 0:22:53.275 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:11:36 -0400 (0:00:00.388) 0:22:53.664 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:11:37 -0400 (0:00:00.822) 0:22:54.486 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 04 October 2025 19:11:38 -0400 (0:00:00.633) 0:22:55.120 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 04 October 2025 19:11:38 -0400 (0:00:00.333) 0:22:55.454 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 04 October 2025 19:11:38 -0400 (0:00:00.319) 0:22:55.773 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 04 October 2025 19:11:39 -0400 (0:00:00.349) 0:22:56.123 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:11:39 -0400 (0:00:00.344) 0:22:56.467 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:11:40 -0400 (0:00:00.865) 0:22:57.333 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:11:40 -0400 (0:00:00.687) 0:22:58.021 ****** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:11:41 -0400 (0:00:00.373) 0:22:58.394 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 04 October 2025 19:11:41 -0400 (0:00:00.610) 0:22:59.004 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 04 October 2025 19:11:42 -0400 (0:00:00.863) 0:22:59.867 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
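/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 04 October 2025 19:11:43 -0400 (0:00:00.765) 0:23:00.633 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

Context for the crypttab checks above and below: each /etc/crypttab line has the form "<name> <backing-device> <key-file> [options]", and because the role just converted the volume back to an unencrypted LV (removing the luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b mapping, "1 line(s) removed" earlier in this run), the expected entry count here is 0, so the per-entry format checks are skipped. A minimal standalone equivalent of this verification might look like the following sketch; the task names are illustrative, not the test's actual tasks:

    - name: Read /etc/crypttab (illustrative sketch)
      ansible.builtin.command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Assert the removed LUKS mapping left no crypttab entry behind
      ansible.builtin.assert:
        that:
          # a surviving entry would contain the luks-<UUID> mapping name
          - crypttab.stdout_lines | select('search', 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b') | list | length == 0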
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 04 October 2025 19:11:44 -0400 (0:00:00.686) 0:23:01.319 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 04 October 2025 19:11:44 -0400 (0:00:00.646) 0:23:01.965 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 04 October 2025 19:11:45 -0400 (0:00:00.612) 0:23:02.578 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:11:45 -0400 (0:00:00.305) 0:23:02.883 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:11:46 -0400 (0:00:00.305) 0:23:03.188 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8
TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:11:46 -0400 (0:00:00.777) 0:23:03.966 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [],
'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 04 October 2025 19:11:47 -0400 (0:00:00.657) 0:23:04.623 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 04 October 2025 19:11:47 -0400 (0:00:00.356) 0:23:04.980 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 04 October 2025 19:11:48 -0400 (0:00:00.329) 0:23:05.309 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 04 October 2025 19:11:48 -0400 (0:00:00.300) 0:23:05.610 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 04 October 2025 19:11:48 -0400 (0:00:00.289) 0:23:05.900 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 04 October 2025 19:11:49 -0400 (0:00:00.311) 0:23:06.212 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
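/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 04 October 2025 19:11:49 -0400 (0:00:00.267) 0:23:06.480 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

The "Clear test variables" and "Reset variable used by test" tasks that recur throughout these verify files follow a single pattern: set helper facts, assert against them, then null them out so the next included pool or volume starts from a clean slate. Sketched in isolation (the variable name is taken from the task result above; this is not the test file itself):

    - name: Reset helper fact so the next verified item starts clean (sketch)
      ansible.builtin.set_fact:
        storage_test_vdo_status: null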
TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:11:49 -0400 (0:00:00.407) 0:23:06.888 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8
TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:11:50 -0400 (0:00:00.988) 0:23:07.877 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:11:51 -0400 (0:00:00.286) 0:23:08.163 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" }
TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:11:51 -0400 (0:00:00.301) 0:23:08.465 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:11:51 -0400 (0:00:00.367) 0:23:08.832 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:11:52 -0400 (0:00:00.315) 0:23:09.148 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:11:52 -0400 (0:00:00.267) 0:23:09.415 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:11:52 -0400 (0:00:00.209) 0:23:09.624 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null },
"changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:11:52 -0400 (0:00:00.279) 0:23:09.904 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:11:53 -0400 (0:00:00.311) 0:23:10.215 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:11:53 -0400 (0:00:00.608) 0:23:10.823 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:11:54 -0400 (0:00:00.656) 0:23:11.480 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:11:56 -0400 (0:00:01.843) 0:23:13.324 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:11:56 -0400 (0:00:00.499) 0:23:13.824 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:11:57 -0400 (0:00:00.760) 0:23:14.584 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:11:58 -0400 (0:00:00.781) 0:23:15.366 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:11:58 -0400 (0:00:00.479) 0:23:15.846 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:11:59 -0400 (0:00:00.766) 0:23:16.612 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
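/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:12:00 -0400 (0:00:00.674) 0:23:17.329 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

The mount verification above works by comparing the volume's expected mount point against the mount facts gathered from the host; the directory owner/group/mode checks are skipped because mount_user, mount_group, and mount_mode are all null in this run. A self-contained check along the same lines (a sketch, not the test's actual implementation) could query ansible_facts.mounts directly:

    - name: Verify /dev/mapper/foo-test1 is mounted on /opt/test1 (sketch)
      ansible.builtin.assert:
        that:
          # ansible_facts.mounts holds one dict per mounted filesystem,
          # with the source under 'device' and the mount point under 'mount'
          - ansible_facts.mounts
            | selectattr('device', 'equalto', '/dev/mapper/foo-test1')
            | selectattr('mount', 'equalto', '/opt/test1')
            | list | length == 1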
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:12:00 -0400 (0:00:00.674) 0:23:18.003 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:12:01 -0400 (0:00:00.266) 0:23:18.270 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:12:01 -0400 (0:00:00.251) 0:23:18.521 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:12:01 -0400 (0:00:00.342) 0:23:18.864 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:12:02 -0400 (0:00:00.393) 0:23:19.258 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:12:03 -0400 (0:00:00.989) 0:23:20.247 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:12:03 -0400 (0:00:00.773) 0:23:21.021 ****** ok: [managed-node8] => {
"changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:12:04 -0400 (0:00:00.710) 0:23:21.732 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:12:05 -0400 (0:00:00.698) 0:23:22.430 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:12:06 -0400 (0:00:00.809) 0:23:23.240 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:12:06 -0400 (0:00:00.298) 0:23:23.538 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:12:07 -0400 (0:00:00.873) 0:23:24.411 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:12:08 -0400 (0:00:00.774) 0:23:25.186 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619447.5422094, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619447.5422094, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619447.5422094, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:12:09 -0400 (0:00:01.278) 0:23:26.464 ****** ok: [managed-node8] => { 
"changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:12:09 -0400 (0:00:00.368) 0:23:26.833 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:12:10 -0400 (0:00:00.344) 0:23:27.178 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:12:10 -0400 (0:00:00.394) 0:23:27.572 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:12:10 -0400 (0:00:00.374) 0:23:27.946 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:12:11 -0400 (0:00:00.240) 0:23:28.187 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:12:11 -0400 (0:00:00.388) 0:23:28.575 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:12:11 -0400 (0:00:00.308) 0:23:28.884 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:12:16 -0400 (0:00:04.774) 0:23:33.658 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 
19:12:16 -0400 (0:00:00.316) 0:23:33.975 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:12:17 -0400 (0:00:00.276) 0:23:34.251 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:12:17 -0400 (0:00:00.789) 0:23:35.041 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:12:18 -0400 (0:00:00.282) 0:23:35.323 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:12:18 -0400 (0:00:00.304) 0:23:35.628 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:12:18 -0400 (0:00:00.413) 0:23:36.041 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:12:19 -0400 (0:00:00.315) 0:23:36.356 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:12:19 -0400 (0:00:00.297) 0:23:36.653 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:12:20 -0400 (0:00:00.749) 0:23:37.403 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
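Because the volume is not (yet) encrypted here, the crypttab check expects zero matching entries. A sketch of that check, assuming entries would be matched by the mapping name; the exact pattern the role uses is not visible in this log:

# Illustrative sketch of the crypttab check for an unencrypted volume.
- name: Read /etc/crypttab
  ansible.builtin.command: cat /etc/crypttab
  register: storage_test_crypttab
  changed_when: false
  failed_when: false   # tolerate a missing /etc/crypttab

- name: Set test variables
  ansible.builtin.set_fact:
    _storage_test_crypttab_entries: "{{ storage_test_crypttab.stdout_lines | d([])
      | select('search', 'foo-test1') | list }}"   # match pattern is an assumption
    _storage_test_expected_crypttab_entries: "0"

- name: Check for /etc/crypttab entry
  ansible.builtin.assert:
    that:
      - _storage_test_crypttab_entries | length ==
        _storage_test_expected_crypttab_entries | int
    msg: Unexpected /etc/crypttab entry for an unencrypted volume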
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:12:21 -0400 (0:00:00.660) 0:23:38.064 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:12:21 -0400 (0:00:00.801) 0:23:38.865 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:12:22 -0400 (0:00:00.801) 0:23:39.667 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:12:23 -0400 (0:00:00.544) 0:23:40.211 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 19:12:23 -0400 (0:00:00.333) 0:23:40.544 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:12:23 -0400 (0:00:00.334) 0:23:40.879 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:12:24 -0400 (0:00:00.294) 0:23:41.174 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:12:24 -0400 (0:00:00.257) 0:23:41.431 ****** skipping: [managed-node8] => { "changed": false, "false_condition":
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:12:24 -0400 (0:00:00.367) 0:23:41.799 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:12:25 -0400 (0:00:00.265) 0:23:42.065 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:12:25 -0400 (0:00:00.326) 0:23:42.392 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:12:25 -0400 (0:00:00.343) 0:23:42.735 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:12:25 -0400 (0:00:00.286) 0:23:43.021 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:12:26 -0400 (0:00:00.191) 0:23:43.213 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:12:26 -0400 (0:00:00.228) 0:23:43.441 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:12:28 -0400 (0:00:01.695) 0:23:45.137 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:12:29 -0400 (0:00:01.537) 0:23:46.675 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:12:30 -0400 (0:00:00.735) 0:23:47.410 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:12:30 -0400 (0:00:00.399) 0:23:47.809 ****** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:12:32 -0400 (0:00:01.719) 0:23:49.529 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:12:33 -0400 (0:00:00.799) 0:23:50.329 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:12:33 -0400 (0:00:00.709) 0:23:51.038 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:12:34 -0400 (0:00:00.699) 0:23:51.737 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:12:35 -0400 (0:00:00.707) 0:23:52.445 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:12:35 -0400 (0:00:00.547) 0:23:53.015 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:12:36 -0400 (0:00:00.720) 0:23:53.735 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:12:37 -0400 (0:00:00.843) 0:23:54.579 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:12:38 -0400 (0:00:00.778) 0:23:55.357 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:12:38 -0400 (0:00:00.646) 0:23:56.004 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:12:39 -0400 (0:00:00.557) 0:23:56.562 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:12:40 -0400 (0:00:00.654) 0:23:57.216 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:12:40 -0400 (0:00:00.699) 0:23:57.915 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:12:41 -0400 (0:00:00.773) 0:23:58.688 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:12:42 -0400 (0:00:00.736) 0:23:59.425 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional 
result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:12:43 -0400 (0:00:00.660) 0:24:00.086 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:12:43 -0400 (0:00:00.708) 0:24:00.794 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:12:44 -0400 (0:00:00.637) 0:24:01.432 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:12:45 -0400 (0:00:00.812) 0:24:02.245 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:12:45 -0400 (0:00:00.711) 0:24:02.957 ****** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:12:46 -0400 (0:00:00.379) 0:24:03.336 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:12:46 -0400 (0:00:00.326) 0:24:03.663 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:12:47 -0400 (0:00:00.760) 0:24:04.423 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038057", "end": "2025-10-04 19:12:48.473708", "rc": 0, "start": "2025-10-04 19:12:48.435651" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
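The cache check shells out to lvs with --nameprefixes, which yields self-describing LVM2_KEY=value output; the segment type is then extracted and asserted, as the next tasks show. A compact reproduction of that flow, with register and fact names chosen for illustration:

# Illustrative reproduction of the lvs-based segment-type check.
- name: Get information about the LV
  ansible.builtin.command:
    argv:
      - lvs
      - --noheadings
      - --nameprefixes
      - --units=b
      - --nosuffix
      - --unquoted
      - -o
      - name,attr,cache_total_blocks,chunk_size,segtype
      - foo/test1
  register: storage_test_lvs
  changed_when: false   # read-only query

- name: Set LV segment type
  ansible.builtin.set_fact:
    # extract the value after LVM2_SEGTYPE= from the key=value output
    storage_test_lv_segtype: '{{ storage_test_lvs.stdout | regex_findall("LVM2_SEGTYPE=(\S+)") }}'

- name: Check segment type
  ansible.builtin.assert:
    that:
      - storage_test_lv_segtype[0] == 'linear'   # matches LVM2_SEGTYPE=linear above
    msg: Unexpected LV segment type

The --nameprefixes/--unquoted combination is what makes the one-line regex extraction reliable regardless of column widths.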
TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:12:48 -0400 (0:00:01.387) 0:24:05.810 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:12:49 -0400 (0:00:00.747) 0:24:06.558 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:12:50 -0400 (0:00:00.848) 0:24:07.407 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:12:50 -0400 (0:00:00.593) 0:24:08.000 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:12:51 -0400 (0:00:00.605) 0:24:08.605 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:12:52 -0400 (0:00:00.565) 0:24:09.171 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:12:52 -0400 (0:00:00.685) 0:24:09.856 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:12:53 -0400 (0:00:00.315) 0:24:10.172 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:12:53 -0400 (0:00:00.701) 0:24:10.874 ****** ok: [managed-node8] => { "ansible_facts": {
"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 04 October 2025 19:12:54 -0400 (0:00:00.307) 0:24:11.182 ****** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:483 Saturday 04 October 2025 19:12:55 -0400 (0:00:01.369) 0:24:12.551 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 04 October 2025 19:12:56 -0400 (0:00:00.762) 0:24:13.314 ****** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 04 October 2025 19:12:56 -0400 (0:00:00.673) 0:24:13.987 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:12:57 -0400 (0:00:00.589) 0:24:14.576 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:12:58 -0400 (0:00:00.491) 0:24:15.068 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:12:58 -0400 (0:00:00.787) 0:24:15.855 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:12:59 -0400 (0:00:00.784) 0:24:16.640 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:12:59 -0400 (0:00:00.385) 0:24:17.026 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:13:00 -0400 (0:00:00.408) 0:24:17.435 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:13:00 -0400 (0:00:00.364) 0:24:17.799 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:13:01 -0400 (0:00:00.320) 0:24:18.119 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:13:02 -0400 (0:00:01.046) 0:24:19.166 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd 
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:13:04 -0400 (0:00:02.494) 0:24:21.661 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:13:05 -0400 (0:00:00.714) 0:24:22.375 ****** ok: [managed-node8] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:13:06 -0400 (0:00:00.744) 0:24:23.120 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:13:08 -0400 (0:00:02.791) 0:24:25.911 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:13:09 -0400 (0:00:00.589) 0:24:26.500 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:13:09 -0400 (0:00:00.518) 0:24:27.018 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:13:10 -0400 (0:00:00.645) 0:24:27.664 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:13:11 -0400 (0:00:00.619) 0:24:28.284 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:13:16
-0400 (0:00:05.118) 0:24:33.403 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service": { "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": 
"user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:13:20 -0400 (0:00:04.178) 0:24:37.581 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:13:21 -0400 (0:00:00.926) 0:24:38.507 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend 
cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": 
"infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-10-04 19:07:41 EDT", "StateChangeTimestampMonotonic": "2811220606", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": 
"generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:13:22 -0400 (0:00:01.488) 0:24:39.996 ****** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Saturday 04 October 2025 19:13:25 -0400 (0:00:02.665) 0:24:42.661 ****** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 
'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:13:26 -0400 (0:00:00.438) 0:24:43.099 ****** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5a76e9d8\x2d506a\x2d42b0\x2d8014\x2ddbfe2d14193b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "name": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "FreezerState": "running", "GID": "[not set]", 
"GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454148096", "LimitMEMLOCKSoft": "454148096", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13686", "LimitNPROCSoft": "13686", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13686", "LimitSIGPENDINGSoft": "13686", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5a76e9d8\\x2d506a\\x2d42b0\\x2d8014\\x2ddbfe2d14193b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d5a76e9d8\\\\x2d506a\\\\x2d42b0\\\\x2d8014\\\\x2ddbfe2d14193b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21897", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 04 October 2025 19:13:27 -0400 (0:00:01.773) 0:24:44.874 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 04 October 2025 19:13:28 -0400 (0:00:00.288) 0:24:45.162 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 04 October 2025 19:13:28 -0400 (0:00:00.449) 0:24:45.612 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 04 October 2025 19:13:28 -0400 (0:00:00.360) 0:24:45.972 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619575.2083216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759619575.2083216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1759619575.2083216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": 
true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "874405791", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 04 October 2025 19:13:30 -0400 (0:00:01.353) 0:24:47.325 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:507 Saturday 04 October 2025 19:13:30 -0400 (0:00:00.455) 0:24:47.781 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:13:32 -0400 (0:00:01.306) 0:24:49.087 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:13:32 -0400 (0:00:00.500) 0:24:49.587 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:13:33 -0400 (0:00:00.620) 0:24:50.208 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK 
[fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:13:34 -0400 (0:00:00.919) 0:24:51.127 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:13:34 -0400 (0:00:00.357) 0:24:51.484 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:13:34 -0400 (0:00:00.314) 0:24:51.799 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:13:35 -0400 (0:00:00.325) 0:24:52.125 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:13:35 -0400 (0:00:00.344) 0:24:52.469 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:13:36 -0400 (0:00:00.905) 0:24:53.374 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:13:38 -0400 (0:00:02.521) 0:24:55.896 ****** ok: [managed-node8] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:13:39 -0400 (0:00:00.670) 0:24:56.567 ****** ok: [managed-node8] => { "storage_volumes | d([]) ": [] }
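
The storage_pools value shown above (the test password is printed in clear by this debug-style task) is the entire interface for this scenario: adding LUKS2 on top of the existing LVM volume is just a re-invocation of the role with encryption enabled. A sketch of an equivalent standalone call, assuming safe mode must be disabled because the volume already carries a filesystem (the next tasks then resolve and install the packages this implies, cryptsetup and lvm2):

    - name: Add LUKS2 encryption to the existing LVM volume (sketch)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false  # the existing xfs filesystem will be re-created
        storage_pools:
          - name: foo
            disks: [sda]
            type: lvm
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo  # test value from this log; use a vaulted secret in practice

TASK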
[fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:13:40 -0400 (0:00:00.727) 0:24:57.294 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:13:43 -0400 (0:00:02.810) 0:25:00.104 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:13:43 -0400 (0:00:00.726) 0:25:00.831 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:13:44 -0400 (0:00:00.665) 0:25:01.496 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:13:45 -0400 (0:00:00.672) 0:25:02.169 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:13:45 -0400 (0:00:00.588) 0:25:02.758 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:13:48 -0400 (0:00:02.485) 0:25:05.244 ****** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { 
"name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:13:51 -0400 (0:00:03.228) 0:25:08.473 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:13:52 -0400 (0:00:00.898) 0:25:09.371 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:13:52 -0400 (0:00:00.189) 0:25:09.561 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null 
} ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:14:04 -0400 (0:00:11.914) 0:25:21.476 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:14:05 -0400 (0:00:00.716) 0:25:22.192 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619459.5083137, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9ccdac7d5c44b40b0c45d3acff3c444c65861dec", "ctime": 1759619459.5053134, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759619459.5053134, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1416, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 19:14:06 -0400 (0:00:01.334) 0:25:23.527 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:14:07 -0400 (0:00:01.522) 0:25:25.050 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:14:08 -0400 (0:00:00.243) 0:25:25.293 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:14:08 -0400 (0:00:00.455) 0:25:25.749 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:14:09 -0400 (0:00:00.419) 0:25:26.168 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 19:14:09 -0400 (0:00:00.419) 0:25:26.588 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 19:14:11 -0400 (0:00:01.792) 0:25:28.381 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 19:14:13 -0400 (0:00:02.170) 0:25:30.551 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 
19:14:15 -0400 (0:00:01.838) 0:25:32.390 ****** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 19:14:16 -0400 (0:00:00.719) 0:25:33.110 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 19:14:18 -0400 (0:00:02.076) 0:25:35.186 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619474.9154477, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1759619465.4483652, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619854, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1759619465.449555, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3537907179", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 19:14:19 -0400 (0:00:01.321) 0:25:36.508 ****** changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 19:14:21 -0400 (0:00:01.645) 0:25:38.154 ****** ok: [managed-node8] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:524 Saturday 04 October 2025 19:14:23 -0400 (0:00:02.321) 
0:25:40.475 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 19:14:24 -0400 (0:00:00.883) 0:25:41.359 ****** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 19:14:24 -0400 (0:00:00.672) 0:25:42.031 ****** skipping: [managed-node8] => { "false_condition": "_storage_volumes_list | length > 0" }
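
The verification include that follows reads back device info, /etc/fstab, and /etc/crypttab to confirm the new state. Roughly the same spot-checks can be expressed as ad-hoc read-only tasks; a sketch using standard utilities (the task names are invented here, while the mapper paths and LUKS UUID are copied from the log):

- name: Confirm the LV now carries a LUKS2 header
  ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
  changed_when: false

- name: Confirm the opened device is mounted at /opt/test1
  ansible.builtin.command: findmnt /opt/test1
  changed_when: false

- name: Confirm the crypttab entry the role just added
  ansible.builtin.command: grep luks-3b6d1c56-6095-4eba-8c81-e126b4142eba /etc/crypttab
  changed_when: false

Marking each task with changed_when: false keeps these purely diagnostic commands from reporting spurious changes.

TASK [Collect info about the volumes.]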
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 19:14:25 -0400 (0:00:00.622) 0:25:42.654 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "3b6d1c56-6095-4eba-8c81-e126b4142eba" }, "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "size": "4G", "type": "crypt", "uuid": "21ba3c7e-2716-4840-ba54-7a2e40ffa415" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 19:14:26 -0400 (0:00:01.241) 0:25:43.895 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003105", "end": "2025-10-04 19:14:27.800840", "rc": 0, "start": "2025-10-04 19:14:27.797735" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 19:14:28 -0400 (0:00:01.188) 0:25:45.083 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003206", "end": "2025-10-04 19:14:28.963293", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:14:28.960087" } STDOUT: luks-3b6d1c56-6095-4eba-8c81-e126b4142eba /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:14:29 -0400 (0:00:01.249) 0:25:46.333 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 04 October 2025 19:14:30 -0400 (0:00:00.847) 0:25:47.180 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 04 October 2025 19:14:30 -0400 (0:00:00.280) 0:25:47.461 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.031163", "end": "2025-10-04 19:14:31.487600", "rc": 0, "start": "2025-10-04 19:14:31.456437" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 04 October 2025 19:14:31 -0400 (0:00:01.340) 0:25:48.801 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 04 October 2025 19:14:32 -0400 (0:00:00.436) 0:25:49.237 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 => (item=members) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 04 October 2025 19:14:34 -0400 (0:00:02.125) 0:25:51.362 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 04 October 2025 19:14:35 -0400 (0:00:00.727) 0:25:52.090 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 04 October 2025 19:14:36 -0400 (0:00:01.346) 0:25:53.437 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 04 October 2025 19:14:37 -0400 (0:00:00.707) 0:25:54.144 ****** ok: [managed-node8] => { 
"ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 04 October 2025 19:14:37 -0400 (0:00:00.802) 0:25:54.946 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 04 October 2025 19:14:38 -0400 (0:00:00.745) 0:25:55.692 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 04 October 2025 19:14:39 -0400 (0:00:00.385) 0:25:56.077 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 04 October 2025 19:14:39 -0400 (0:00:00.689) 0:25:56.767 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 04 October 2025 19:14:40 -0400 (0:00:00.373) 0:25:57.141 ****** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 04 October 2025 19:14:40 -0400 (0:00:00.391) 0:25:57.532 ****** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:99563): WARNING **: 19:14:41.419: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.173 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.173 originally 10.31.44.173 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6fa89a8227' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.173 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 04 October 2025 19:14:41 -0400 (0:00:01.338) 0:25:58.870 ****** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 04 October 2025 19:14:42 -0400 (0:00:00.676) 0:25:59.547 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 04 October 2025 19:14:43 -0400 (0:00:00.804) 0:26:00.352 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 04 October 2025 19:14:43 -0400 (0:00:00.316) 0:26:00.668 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 04 October 2025 19:14:44 -0400 (0:00:00.400) 0:26:01.069 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 04 October 2025 19:14:44 -0400 (0:00:00.305) 0:26:01.374 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 04 October 2025 19:14:44 -0400 (0:00:00.340) 0:26:01.714 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 04 October 2025 19:14:44 -0400 (0:00:00.281) 0:26:01.996 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 04 October 2025 19:14:45 -0400 (0:00:00.295) 0:26:02.292 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 04 October 2025 19:14:45 -0400 (0:00:00.304) 0:26:02.596 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 04 October 2025 19:14:45 -0400 (0:00:00.327) 0:26:02.924 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 04 October 2025 19:14:46 -0400 (0:00:00.274) 0:26:03.198 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 04 October 2025 19:14:46 -0400 (0:00:00.241) 0:26:03.440 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 04 October 2025 19:14:46 -0400 (0:00:00.279) 0:26:03.720 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 04 October 2025 19:14:47 -0400 (0:00:00.700) 0:26:04.420 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 04 October 2025 19:14:47 -0400 (0:00:00.553) 0:26:04.974 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 04 October 2025 19:14:48 -0400 (0:00:00.359) 0:26:05.334 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 04 October 2025 19:14:48 -0400 (0:00:00.330) 0:26:05.664 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 04 October 2025 19:14:48 -0400 (0:00:00.389) 0:26:06.054 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 04 October 2025 19:14:49 -0400 (0:00:00.369) 0:26:06.424 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 04 October 2025 19:14:49 -0400 
(0:00:00.363) 0:26:06.788 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 04 October 2025 19:14:50 -0400 (0:00:00.369) 0:26:07.157 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 04 October 2025 19:14:50 -0400 (0:00:00.386) 0:26:07.544 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 04 October 2025 19:14:51 -0400 (0:00:00.773) 0:26:08.317 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 04 October 2025 19:14:51 -0400 (0:00:00.643) 0:26:08.961 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 04 October 2025 19:14:52 -0400 (0:00:00.262) 0:26:09.223 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK 
[Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 04 October 2025 19:14:52 -0400 (0:00:00.323) 0:26:09.547 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 04 October 2025 19:14:52 -0400 (0:00:00.273) 0:26:09.821 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 04 October 2025 19:14:53 -0400 (0:00:00.353) 0:26:10.175 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 04 October 2025 19:14:53 -0400 (0:00:00.864) 0:26:11.039 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 04 October 2025 19:14:54 -0400 (0:00:00.652) 0:26:11.692 ****** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 04 October 2025 19:14:54 -0400 (0:00:00.309) 0:26:12.001 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 04 October 2025 19:14:55 -0400 (0:00:00.664) 0:26:12.666 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 04 October 2025 19:14:56 -0400 (0:00:00.817) 0:26:13.483 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 04 October 2025 19:14:57 -0400 (0:00:00.780) 0:26:14.263 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 04 October 2025 19:14:57 -0400 (0:00:00.635) 0:26:14.899 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 04 October 2025 19:14:58 -0400 (0:00:00.684) 0:26:15.584 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 04 October 2025 19:14:59 -0400 (0:00:00.760) 0:26:16.344 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 04 October 2025 19:14:59 -0400 (0:00:00.341) 0:26:16.686 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 04 October 2025 19:14:59 -0400 (0:00:00.310) 0:26:16.996 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 04 October 2025 19:15:00 -0400 (0:00:00.777) 0:26:17.773 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 
'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 04 October 2025 19:15:01 -0400 (0:00:00.738) 0:26:18.512 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 04 October 2025 19:15:01 -0400 (0:00:00.284) 0:26:18.796 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 04 October 2025 19:15:02 -0400 (0:00:00.337) 0:26:19.134 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 04 October 2025 19:15:02 -0400 (0:00:00.277) 0:26:19.411 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 04 October 2025 19:15:02 -0400 (0:00:00.280) 0:26:19.692 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 04 October 2025 19:15:03 -0400 (0:00:00.374) 0:26:20.080 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } 
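Every VDO check above is skipped because the volume under test leaves both 'deduplication' and 'compression' as None, so the gating condition "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none" never becomes true. As a minimal sketch, a pool definition along the following lines would exercise those tasks instead of skipping them; the field names come from the volume dict echoed in this log, while the values are purely illustrative:

storage_pools:
  - name: foo
    disks:
      - sda
    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        deduplication: true  # lets "Check if VDO deduplication is on" actually assert
        compression: true    # likewise enables the compression checks
        vdo_pool_size: 8g    # illustrative size for the backing VDO pool

With either setting non-None, the verification tasks above would gather VDO status for the volume and assert on it rather than skip.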
TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 04 October 2025 19:15:03 -0400 (0:00:00.353) 0:26:20.433 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 04 October 2025 19:15:03 -0400 (0:00:00.278) 0:26:20.712 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 04 October 2025 19:15:04 -0400 (0:00:00.838) 0:26:21.550 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 04 October 2025 19:15:04 -0400 (0:00:00.281) 0:26:21.832 ****** skipping: [managed-node8] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 04 October 2025 19:15:05 -0400 (0:00:00.283) 0:26:22.116 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 04 October 2025 19:15:05 -0400 (0:00:00.236) 0:26:22.353 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 04 October 2025 19:15:05 -0400 (0:00:00.286) 0:26:22.639 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 04 October 2025 19:15:05 -0400 (0:00:00.288) 0:26:22.927 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 04 October 2025 19:15:06 -0400 (0:00:00.255) 0:26:23.182 
****** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 04 October 2025 19:15:06 -0400 (0:00:00.399) 0:26:23.582 ****** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 04 October 2025 19:15:06 -0400 (0:00:00.358) 0:26:23.940 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:15:07 -0400 (0:00:00.582) 0:26:24.522 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:15:08 -0400 (0:00:00.645) 0:26:25.168 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:15:10 -0400 (0:00:01.887) 0:26:27.055 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:15:10 -0400 (0:00:00.469) 0:26:27.525 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:15:13 -0400 (0:00:03.465) 0:26:30.990 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:15:14 -0400 (0:00:00.712) 0:26:31.702 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:15:15 -0400 (0:00:00.390) 0:26:32.093 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:15:15 -0400 (0:00:00.844) 0:26:32.938 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK 
[Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:15:16 -0400 (0:00:00.757) 0:26:33.695 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:15:17 -0400 (0:00:00.822) 0:26:34.518 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:15:17 -0400 (0:00:00.338) 0:26:34.857 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:15:18 -0400 (0:00:00.247) 0:26:35.104 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:15:18 -0400 (0:00:00.302) 0:26:35.406 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:15:18 -0400 (0:00:00.344) 0:26:35.750 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:15:19 -0400 (0:00:01.028) 0:26:36.779 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:15:20 -0400 (0:00:00.736) 0:26:37.516 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:15:21 -0400 (0:00:00.781) 0:26:38.298 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:15:21 -0400 (0:00:00.601) 0:26:38.900 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:15:22 -0400 (0:00:00.721) 0:26:39.622 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:15:22 -0400 (0:00:00.260) 0:26:39.882 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:15:23 -0400 (0:00:00.777) 0:26:40.660 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:15:24 -0400 (0:00:00.628) 0:26:41.288 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619643.8329194, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619643.8329194, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619643.8329194, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:15:25 -0400 (0:00:01.344) 0:26:42.633 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:15:25 -0400 (0:00:00.370) 0:26:43.004 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:15:26 -0400 (0:00:00.287) 0:26:43.292 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:15:26 -0400 (0:00:00.341) 0:26:43.634 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:15:26 -0400 (0:00:00.297) 0:26:43.931 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:15:27 -0400 (0:00:00.286) 0:26:44.217 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:15:27 -0400 (0:00:00.400) 0:26:44.618 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619644.0859215, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619644.0859215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2126, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1759619644.0859215, "nlink": 1, "path": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:15:28 -0400 (0:00:01.385) 0:26:46.004 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:15:31 -0400 (0:00:02.435) 0:26:48.439 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007588", "end": "2025-10-04 19:15:32.498463", "rc": 0, "start": "2025-10-04 19:15:32.490875" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           3b6d1c56-6095-4eba-8c81-e126b4142eba
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     657352
        Threads:    2
        Salt:       0d 26 17 f6 59 5a 2b 66 b0 68 ee 47 b0 90 de 0c
                    e8 6c 0d a8 39 c7 a6 c3 9f 53 00 2a 72 19 11 44
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 129774
        Salt:       48 e9 a1 84 91 d3 5d 3c 9b 12 2b 48 89 24 9b 9f
                    aa 95 82 23 20 30 0c c6 71 25 11 cd ef 24 3e 51
        Digest:     5d f4 2c 1f 0c d7 de 0b e9 ab 2b f9 29 f7 75 91
                    77 ec 83 0b a0 9a 4d 86 fa af 09 9b f5 84 c0 fc

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:15:32 -0400 (0:00:01.340) 0:26:49.779 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:15:33 -0400 (0:00:00.683) 0:26:50.463 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:15:34 -0400 (0:00:00.710) 0:26:51.173 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:15:34 -0400 (0:00:00.431) 0:26:51.605 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:15:34 -0400 (0:00:00.382) 0:26:51.987 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed
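The luksDump output collected above is exactly what these assertions inspect: a LUKS2 header ("Version: 2") on the raw device /dev/mapper/foo-test1, an aes-xts-plain64 cipher with a 512-bit key, and argon2id as the PBKDF. As a rough standalone sketch, a check like "Check LUKS version" can be written as a command-plus-assert pair; this is an illustration that assumes only that cryptsetup is installed, not the test's actual assertion code:

- name: Collect the LUKS header
  ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump
  changed_when: false  # read-only query, never reports a change

- name: Assert a LUKS2 header with the expected cipher
  ansible.builtin.assert:
    that:
      - luks_dump.stdout is search('Version:\s+2')
      - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')

Using the search test instead of a literal substring keeps the assertion robust against the column padding in cryptsetup's output.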
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:15:35 -0400 (0:00:00.975) 0:26:52.963 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:15:36 -0400 (0:00:00.408) 0:26:53.372 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:15:36 -0400 (0:00:00.773) 0:26:53.793 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:15:37 -0400 (0:00:00.773) 0:26:54.566 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:15:38 -0400 (0:00:00.763) 0:26:55.330 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:15:39 -0400 (0:00:00.737) 0:26:56.068 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:15:39 -0400 (0:00:00.647) 0:26:56.715 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:15:40 -0400 (0:00:00.692) 0:26:57.408 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04
October 2025 19:15:40 -0400 (0:00:00.289) 0:26:57.698 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:15:40 -0400 (0:00:00.246) 0:26:57.944 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:15:41 -0400 (0:00:00.260) 0:26:58.205 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:15:41 -0400 (0:00:00.267) 0:26:58.472 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:15:41 -0400 (0:00:00.243) 0:26:58.716 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:15:41 -0400 (0:00:00.275) 0:26:58.991 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:15:42 -0400 (0:00:00.255) 0:26:59.247 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:15:42 -0400 (0:00:00.302) 0:26:59.549 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:15:42 -0400 (0:00:00.340) 0:26:59.890 ****** skipping: [managed-node8] => { "changed": false, 
"false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:15:43 -0400 (0:00:00.292) 0:27:00.182 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:15:43 -0400 (0:00:00.261) 0:27:00.444 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:15:44 -0400 (0:00:01.495) 0:27:01.939 ****** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:15:46 -0400 (0:00:01.598) 0:27:03.538 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:15:47 -0400 (0:00:00.723) 0:27:04.262 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:15:47 -0400 (0:00:00.409) 0:27:04.671 ****** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:15:49 -0400 (0:00:01.463) 0:27:06.135 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:15:49 -0400 (0:00:00.671) 0:27:06.806 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:15:50 -0400 (0:00:00.763) 0:27:07.570 ****** skipping: [managed-node8] => { "false_condition": "'%' in storage_test_volume.size | string" } 
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:15:51 -0400 (0:00:00.609) 0:27:08.179 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:15:51 -0400 (0:00:00.710) 0:27:08.890 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:15:52 -0400 (0:00:00.811) 0:27:09.702 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:15:53 -0400 (0:00:00.651) 0:27:10.353 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:15:54 -0400 (0:00:00.738) 0:27:11.091 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:15:54 -0400 (0:00:00.655) 0:27:11.747 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:15:55 -0400 (0:00:00.633) 0:27:12.381 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:15:56 -0400 (0:00:00.735) 0:27:13.117 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:15:56 -0400 (0:00:00.739) 0:27:13.856 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:15:57 -0400 (0:00:00.502) 0:27:14.358 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:15:57 -0400 (0:00:00.574) 0:27:14.933 ****** skipping: [managed-node8] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:15:58 -0400 (0:00:00.701) 0:27:15.634 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:15:59 -0400 (0:00:00.712) 0:27:16.347 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:15:59 -0400 (0:00:00.672) 0:27:17.020 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:16:00 -0400 (0:00:00.716) 0:27:17.736 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:16:01 -0400 (0:00:00.630) 0:27:18.366 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:16:01 -0400 (0:00:00.567) 0:27:18.934 ****** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", 
"parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:16:02 -0400 (0:00:00.423) 0:27:19.358 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:16:02 -0400 (0:00:00.347) 0:27:19.705 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:16:03 -0400 (0:00:00.759) 0:27:20.465 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.029769", "end": "2025-10-04 19:16:04.590742", "rc": 0, "start": "2025-10-04 19:16:04.560973" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:16:04 -0400 (0:00:01.444) 0:27:21.910 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:16:05 -0400 (0:00:00.621) 0:27:22.532 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:16:06 -0400 (0:00:00.722) 0:27:23.255 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:16:06 -0400 (0:00:00.584) 0:27:23.839 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:16:07 -0400 (0:00:00.585) 0:27:24.425 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** 
task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:16:07 -0400 (0:00:00.569) 0:27:24.995 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:16:08 -0400 (0:00:00.845) 0:27:25.840 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:16:09 -0400 (0:00:00.330) 0:27:26.171 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:16:09 -0400 (0:00:00.570) 0:27:26.742 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:527 Saturday 04 October 2025 19:16:10 -0400 (0:00:00.334) 0:27:27.076 ****** included: fedora.linux_system_roles.storage for managed-node8 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 04 October 2025 19:16:11 -0400 (0:00:01.368) 0:27:28.444 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 04 October 2025 19:16:11 -0400 (0:00:00.583) 0:27:29.028 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 04 October 2025 19:16:12 -0400 (0:00:00.698) 0:27:29.727 ****** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", 
"libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node8] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 04 October 2025 19:16:13 -0400 (0:00:00.873) 0:27:30.600 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 04 October 2025 19:16:13 -0400 (0:00:00.349) 0:27:30.950 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 04 October 2025 19:16:14 -0400 (0:00:00.334) 0:27:31.284 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 04 October 2025 19:16:14 -0400 (0:00:00.323) 0:27:31.608 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 04 October 2025 19:16:14 -0400 (0:00:00.328) 0:27:31.937 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 04 October 2025 19:16:15 -0400 (0:00:00.904) 0:27:32.841 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm 
libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 04 October 2025 19:16:18 -0400 (0:00:02.427) 0:27:35.269 ****** ok: [managed-node8] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 04 October 2025 19:16:18 -0400 (0:00:00.655) 0:27:35.924 ****** ok: [managed-node8] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 04 October 2025 19:16:19 -0400 (0:00:00.684) 0:27:36.608 ****** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 04 October 2025 19:16:22 -0400 (0:00:02.888) 0:27:39.497 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 04 October 2025 19:16:23 -0400 (0:00:00.686) 0:27:40.184 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 04 October 2025 19:16:23 -0400 (0:00:00.619) 0:27:40.803 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 04 October 2025 19:16:24 -0400 (0:00:00.503) 0:27:41.307 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 04 October 2025 19:16:24 -0400 (0:00:00.595) 0:27:41.902 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 04 October 2025 19:16:29 -0400 (0:00:05.101) 0:27:47.004 ****** ok: [managed-node8] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", 
"status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 04 October 2025 19:16:33 -0400 (0:00:03.138) 0:27:50.142 ****** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 04 October 2025 19:16:34 -0400 (0:00:00.950) 0:27:51.093 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 04 October 2025 19:16:34 -0400 (0:00:00.260) 0:27:51.353 ****** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 04 October 2025 19:16:37 -0400 (0:00:03.414) 0:27:54.768 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 04 October 2025 19:16:38 -0400 (0:00:00.571) 0:27:55.340 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619655.029017, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c77559fc7c64efc62d5e00e5a5b6107380a641a6", "ctime": 1759619655.0250168, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 683671747, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1759619655.0250168, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "2862150521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 04 October 2025 19:16:39 -0400 (0:00:01.262) 0:27:56.602 ****** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 04 October 2025 19:16:40 -0400 (0:00:01.435) 0:27:58.037 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 04 October 2025 19:16:41 -0400 (0:00:00.186) 0:27:58.223 ****** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": 
"/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 04 October 2025 19:16:41 -0400 (0:00:00.482) 0:27:58.705 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 04 October 2025 19:16:42 -0400 (0:00:00.370) 0:27:59.076 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 04 October 2025 19:16:42 -0400 (0:00:00.345) 0:27:59.422 ****** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node8] => (item={'src': '/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", 
"boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3b6d1c56-6095-4eba-8c81-e126b4142eba" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 04 October 2025 19:16:44 -0400 (0:00:01.932) 0:28:01.355 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 04 October 2025 19:16:46 -0400 (0:00:01.940) 0:28:03.295 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 04 October 2025 19:16:46 -0400 (0:00:00.749) 0:28:04.045 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 04 October 2025 19:16:47 -0400 (0:00:00.900) 0:28:04.945 ****** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 04 October 2025 19:16:49 -0400 (0:00:02.003) 0:28:06.949 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619668.9621382, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "61309c052f4924e8eb9bde5be51383618fb3ef4f", "ctime": 1759619660.8470676, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 209715343, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1759619660.8470683, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1756338217", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 04 October 2025 19:16:51 -0400 (0:00:01.315) 0:28:08.264 ****** changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3b6d1c56-6095-4eba-8c81-e126b4142eba', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": 
"luks-3b6d1c56-6095-4eba-8c81-e126b4142eba", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 04 October 2025 19:16:53 -0400 (0:00:01.902) 0:28:10.166 ****** ok: [managed-node8] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:537 Saturday 04 October 2025 19:16:55 -0400 (0:00:02.196) 0:28:12.362 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 04 October 2025 19:16:56 -0400 (0:00:00.843) 0:28:13.206 ****** skipping: [managed-node8] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 04 October 2025 19:16:56 -0400 (0:00:00.599) 0:28:13.806 ****** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 04 October 2025 19:16:57 -0400 (0:00:00.711) 0:28:14.518 ****** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "8f91d125-a2a0-4395-9c60-8981f3d516e2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 04 October 2025 19:16:58 -0400 (0:00:01.223) 0:28:15.741 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003234", "end": "2025-10-04 19:16:59.623686", "rc": 0, "start": "2025-10-04 19:16:59.620452" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Sep 30 13:43:10 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=8f91d125-a2a0-4395-9c60-8981f3d516e2 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 04 October 2025 19:16:59 -0400 (0:00:01.177) 0:28:16.919 ****** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003342", "end": "2025-10-04 19:17:00.777049", "failed_when_result": false, "rc": 0, "start": "2025-10-04 19:17:00.773707" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 04 October 2025 19:17:01 -0400 (0:00:01.141) 0:28:18.060 ****** skipping: [managed-node8] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 04 October 2025 19:17:01 -0400 (0:00:00.531) 0:28:18.592 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=4dp1Fv-bvao-1UZJ-Uroo-RdD2-Lzni-JhB8EB'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 04 October 2025 19:17:02 -0400 (0:00:00.871) 0:28:19.464 ****** ok: [managed-node8] => {
"ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 04 October 2025 19:17:03 -0400 (0:00:00.856) 0:28:20.320 ****** included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 => (item=mount) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 => (item=fstab) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 => (item=fs) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 => (item=device) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 => (item=encryption) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 => (item=md) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 => (item=size) included: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 04 October 2025 19:17:04 -0400 (0:00:01.421) 0:28:21.741 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 04 October 2025 19:17:05 -0400 (0:00:00.353) 0:28:22.095 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 04 October 2025 19:17:05 -0400 (0:00:00.651) 0:28:22.746 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 04 October 2025 19:17:06 -0400 (0:00:00.630) 0:28:23.377 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 04 October 2025 19:17:06 -0400 (0:00:00.368) 0:28:23.745 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 04 October 2025 19:17:06 -0400 (0:00:00.239) 0:28:23.985 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 04 October 2025 19:17:07 -0400 (0:00:00.341) 0:28:24.326 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 04 October 2025 19:17:07 -0400 (0:00:00.218) 0:28:24.545 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 04 October 2025 19:17:07 -0400 (0:00:00.292) 0:28:24.837 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 04 October 2025 19:17:08 -0400 (0:00:00.294) 0:28:25.132 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 04 October 2025 19:17:08 -0400 (0:00:00.269) 0:28:25.402 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 04 October 2025 19:17:08 -0400 (0:00:00.258) 0:28:25.660 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 04 October 2025 19:17:09 -0400 (0:00:00.986) 0:28:26.647 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 04 October 2025 19:17:10 -0400 (0:00:00.560) 0:28:27.208 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 04 October 2025 19:17:10 -0400 (0:00:00.669) 0:28:27.877 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 04 October 2025 19:17:11 -0400 (0:00:00.600) 0:28:28.478 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 04 October 2025 19:17:12 -0400 (0:00:00.647) 0:28:29.125 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 04 October 2025 19:17:12 -0400 (0:00:00.235) 0:28:29.361 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 04 October 2025 19:17:12 -0400 (0:00:00.613) 0:28:29.975 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 04 October 2025 19:17:13 -0400 (0:00:00.472) 0:28:30.447 ****** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1759619797.3442566, "attr_flags": 
"", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1759619797.3442566, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 452, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1759619797.3442566, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 04 October 2025 19:17:14 -0400 (0:00:01.372) 0:28:31.820 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 04 October 2025 19:17:15 -0400 (0:00:00.456) 0:28:32.276 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 04 October 2025 19:17:15 -0400 (0:00:00.307) 0:28:32.584 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 04 October 2025 19:17:15 -0400 (0:00:00.310) 0:28:32.895 ****** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 04 October 2025 19:17:16 -0400 (0:00:00.388) 0:28:33.283 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 04 October 2025 19:17:16 -0400 (0:00:00.265) 0:28:33.549 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 04 October 2025 19:17:16 -0400 (0:00:00.260) 0:28:33.809 ****** skipping: [managed-node8] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 04 October 2025 19:17:16 -0400 (0:00:00.226) 0:28:34.036 ****** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 04 October 2025 19:17:19 -0400 (0:00:02.439) 0:28:36.475 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 04 October 2025 19:17:19 -0400 (0:00:00.323) 0:28:36.798 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 04 October 2025 19:17:20 -0400 (0:00:00.322) 0:28:37.121 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 04 October 2025 19:17:20 -0400 (0:00:00.338) 0:28:37.460 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 04 October 2025 19:17:20 -0400 (0:00:00.259) 0:28:37.720 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 04 October 2025 19:17:20 -0400 (0:00:00.296) 0:28:38.016 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 04 October 2025 19:17:21 -0400 (0:00:00.310) 0:28:38.327 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 04 October 2025 19:17:21 -0400 (0:00:00.294) 0:28:38.622 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 04 October 2025 19:17:21 -0400 (0:00:00.285) 0:28:38.907 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 04 October 2025 19:17:22 -0400 (0:00:00.773) 0:28:39.681 ****** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 04 October 2025 19:17:23 -0400 (0:00:00.767) 0:28:40.449 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 04 October 2025 19:17:23 -0400 (0:00:00.587) 0:28:41.036 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 04 October 2025 19:17:24 -0400 (0:00:00.699) 0:28:41.735 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 04 October 2025 19:17:25 -0400 (0:00:00.589) 0:28:42.325 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 04 October 2025 19:17:25 -0400 (0:00:00.372) 0:28:42.698 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", 
"skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 04 October 2025 19:17:25 -0400 (0:00:00.318) 0:28:43.016 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 04 October 2025 19:17:26 -0400 (0:00:00.298) 0:28:43.314 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 04 October 2025 19:17:26 -0400 (0:00:00.304) 0:28:43.619 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 04 October 2025 19:17:26 -0400 (0:00:00.265) 0:28:43.885 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 04 October 2025 19:17:27 -0400 (0:00:00.205) 0:28:44.090 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 04 October 2025 19:17:27 -0400 (0:00:00.294) 0:28:44.385 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 04 October 2025 19:17:27 -0400 (0:00:00.291) 0:28:44.677 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 04 October 2025 19:17:27 -0400 (0:00:00.315) 0:28:44.992 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 04 October 2025 19:17:28 -0400 (0:00:00.284) 0:28:45.276 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 04 October 2025 19:17:28 -0400 (0:00:00.292) 0:28:45.569 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 04 October 2025 19:17:29 -0400 (0:00:00.711) 0:28:46.280 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 04 October 2025 19:17:30 -0400 (0:00:00.803) 0:28:47.084 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 04 October 2025 19:17:30 -0400 (0:00:00.528) 0:28:47.613 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 04 October 2025 19:17:30 -0400 (0:00:00.272) 0:28:47.885 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 04 October 2025 19:17:31 -0400 (0:00:00.534) 0:28:48.419 ****** skipping: [managed-node8] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 04 October 2025 19:17:31 -0400 (0:00:00.543) 0:28:48.963 ****** skipping: [managed-node8] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 04 October 2025 19:17:32 -0400 (0:00:00.652) 0:28:49.616 ****** skipping: [managed-node8] => { "false_condition": 
"_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 04 October 2025 19:17:35 -0400 (0:00:03.429) 0:28:53.046 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 04 October 2025 19:17:36 -0400 (0:00:00.474) 0:28:53.520 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 04 October 2025 19:17:36 -0400 (0:00:00.256) 0:28:53.777 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 04 October 2025 19:17:36 -0400 (0:00:00.252) 0:28:54.029 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 04 October 2025 19:17:37 -0400 (0:00:00.326) 0:28:54.356 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 04 October 2025 19:17:37 -0400 (0:00:00.321) 0:28:54.677 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 04 October 2025 19:17:37 -0400 (0:00:00.348) 0:28:55.026 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 04 October 2025 19:17:38 -0400 (0:00:00.334) 0:28:55.361 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] 
************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 04 October 2025 19:17:38 -0400 (0:00:00.347) 0:28:55.709 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 04 October 2025 19:17:38 -0400 (0:00:00.346) 0:28:56.055 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 04 October 2025 19:17:39 -0400 (0:00:00.368) 0:28:56.424 ****** skipping: [managed-node8] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 04 October 2025 19:17:39 -0400 (0:00:00.377) 0:28:56.801 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 04 October 2025 19:17:40 -0400 (0:00:00.437) 0:28:57.238 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 04 October 2025 19:17:40 -0400 (0:00:00.374) 0:28:57.612 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 04 October 2025 19:17:40 -0400 (0:00:00.392) 0:28:58.005 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 04 October 2025 19:17:41 -0400 (0:00:00.438) 0:28:58.443 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 04 October 2025 19:17:41 -0400 (0:00:00.379) 0:28:58.823 ****** ok: [managed-node8] => { 
"storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 04 October 2025 19:17:42 -0400 (0:00:00.404) 0:28:59.228 ****** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 04 October 2025 19:17:42 -0400 (0:00:00.356) 0:28:59.607 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 04 October 2025 19:17:43 -0400 (0:00:00.652) 0:29:00.260 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 04 October 2025 19:17:43 -0400 (0:00:00.277) 0:29:00.537 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 04 October 2025 19:17:43 -0400 (0:00:00.345) 0:29:00.882 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 04 October 2025 19:17:44 -0400 (0:00:00.313) 0:29:01.195 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 04 October 2025 19:17:44 -0400 (0:00:00.216) 0:29:01.411 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 04 October 2025 19:17:44 -0400 (0:00:00.300) 0:29:01.711 ****** 
skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 04 October 2025 19:17:44 -0400 (0:00:00.312) 0:29:02.024 ****** skipping: [managed-node8] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 04 October 2025 19:17:45 -0400 (0:00:00.332) 0:29:02.356 ****** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 04 October 2025 19:17:45 -0400 (0:00:00.390) 0:29:02.746 ****** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed-node8 : ok=1250 changed=60 unreachable=0 failed=0 skipped=1068 rescued=18 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:49:54.914878+00:00Z", "host": "managed-node8", "message": "encrypted volume 'foo' missing key/password", "start_time": "2025-10-04T22:49:52.632111+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:49:55.260963+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
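The "encrypted volume 'foo' missing key/password" failure recorded above is the storage role refusing to create a LUKS volume when neither encryption_key nor encryption_password is supplied (both are null in the recorded module_args). Consistent with rescued=18 in the recap, these recorded failures appear to be the test's intended negative cases rather than real breakage. For reference, a minimal sketch of a volume spec that would pass this check, reusing the values from the recorded module_args; the passphrase is a placeholder (the real one is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER) and would normally come from Ansible Vault:

- hosts: managed-node8
  tasks:
    - name: Create an encrypted LUKS2 volume on the test disk
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: CHANGE_ME  # placeholder, not the test's real passphrase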
"raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T22:49:55.030655+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:52:09.265214+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079' in safe mode due to encryption removal", "start_time": "2025-10-04T22:52:07.125846+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:52:09.610262+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": 
[ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-7f04d860-5fdb-4936-bbb5-ef0a46440079' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T22:52:09.403471+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:54:05.160768+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2025-10-04T22:54:03.042963+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:54:05.652194+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T22:54:05.338740+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:56:19.527292+00:00Z", "host": "managed-node8", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-10-04T22:56:17.341857+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:56:20.056930+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T22:56:19.734587+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:58:59.975893+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'luks-b03103b5-9920-48d2-9376-8a11631c90a5' in safe mode due to encryption removal", "start_time": "2025-10-04T22:58:57.318954+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T22:59:00.432004+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-b03103b5-9920-48d2-9376-8a11631c90a5' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T22:59:00.116910+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:01:37.884991+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2025-10-04T23:01:35.593179+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:01:38.340745+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, 
"grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T23:01:38.024164+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:04:28.856083+00:00Z", "host": "managed-node8", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-10-04T23:04:26.344428+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:04:29.320789+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T23:04:29.018490+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:10:11.165684+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b' in safe mode due to encryption removal", "start_time": "2025-10-04T23:10:08.271659+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:10:11.662841+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, 
"diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-5a76e9d8-506a-42b0-8014-dbfe2d14193b' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-10-04T23:10:11.317975+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:13:25.478361+00:00Z", "host": "managed-node8", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2025-10-04T23:13:22.942380+00:00Z", 
"task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2025-10-04T23:13:25.887171+00:00Z", "host": "managed-node8", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": 
"2025-10-04T23:13:25.625698+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 04 October 2025 19:17:45 -0400 (0:00:00.255) 0:29:03.002 ****** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.87s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.83s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.68s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.47s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.91s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.85s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get service facts ------------------- 5.15s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Make sure required packages are installed --- 5.12s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 fedora.linux_system_roles.storage : Make sure required packages are installed --- 5.10s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 fedora.linux_system_roles.storage : Get service facts ------------------- 4.82s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 4.79s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Ensure cryptsetup is present -------------------------------------------- 4.77s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Gathering Facts --------------------------------------------------------- 4.75s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 4.53s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 4.18s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Update facts ------------------------ 4.14s 
/tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 fedora.linux_system_roles.storage : Get service facts ------------------- 3.99s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 3.91s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Get the canonical device path for each member device -------------------- 3.81s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 fedora.linux_system_roles.storage : Make sure blivet is available ------- 3.75s /tmp/collections-QnR/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
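
Note: the repeated "cannot remove existing formatting ... in safe mode" failures collected above are the role's safe-mode guard, exercised deliberately by this test. With the default storage_safe_mode: true, the role refuses any action that would destroy existing formatting, which includes layering LUKS onto an already-formatted device ("due to adding encryption") or stripping LUKS off one ("due to encryption removal"). A minimal sketch of a play that first trips and then deliberately bypasses that guard, using the same sda test disk and volume shape seen in the first failed invocation; the play name and the vault_luks_password variable are illustrative placeholders, not part of the test:

- name: Encrypt an already-formatted disk with LUKS2 (data-destructive)
  hosts: managed-node8
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        # storage_safe_mode defaults to true; leaving it true reproduces
        # "cannot remove existing formatting on device 'sda' in safe mode
        # due to adding encryption". Setting it to false acknowledges that
        # reformatting the device (and losing its data) is intentional.
        storage_safe_mode: false
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "{{ vault_luks_password }}"  # placeholder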
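
Likewise, the two "encrypted volume 'test1' missing key/password" failures are input validation rather than a device error: in both logged invocations the volume has encryption: true while encryption_key and encryption_password are both null, so the role fails before touching any storage. A sketch of the corrected spec for the LVM variant of those runs, plugging into the same role invocation as the sketch above (vault_luks_password is again a placeholder):

        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # Supplying either a password or a key file path satisfies
                # the check; with neither, the run fails as logged above.
                encryption_password: "{{ vault_luks_password }}"
                # encryption_key: /root/luks-keyfile  # alternative, assumed path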